[Binary archive content omitted: POSIX tar stream containing the directory `var/home/core/zuul-output/` (owner `core`) with subdirectory `logs/` and the file `logs/kubelet.log.gz`. The remainder is gzip-compressed kubelet log data and is not recoverable as text.]
MfaV&~ S\ @}?~fBgJ9%I:QC>=k8J<:4F^t4%ke^p U{w@ש7}0C ia:\`_Yt Ƀ^- 8SU1H rZ0b-ě\mR8n*]C)\uzQnSR.]ΌV=3.ݘƗ2ƈѦ^~\؎ۏ)ӏ9alHDGv6;J96F tK-"yJ Ke[,J8:4GI͘$5?1nX{ /pylD`˹}j3g5 R0A+^q]+* sˇV`W: =<7ܚMP}G3/Ѥm-ls=ZY5ׯ$EJ5RW .XM}\כڜbiY{٥hHRh@u\Z+M;VJ8;`pQYQBBpL"ɜz[&kqXYdl4аϿ}F\7l==rkyV #Z'b5v>*Ŝ3ߞTtYu="/de3:R^v=S!ܜ '<$Q(=c!ρDW^.LscXgaTKR#[٭x4īF{}造q1r}Vk/c(K_ Gw1>~qՁ%D "5Ik۵ğV̧GXH+ t C84%tcto_߽arY|G2Ԥ S*f k LڷSp&_9}hm]RrA7>~kZlDe.5=;e%]M{[O5v{^Y9nbey}yr17{ޜiiZ!%._xA!D>C$d J IwTד#+|$Wo;~ZQ|8y+ dE_FS!IT<*ɗ/kVȴz/L;烖.$-[U$ЂjM!}&f&RȉZes>ݦO&[ӪkO^ /kOj8l'R$vY]o`r~9_?9OLcmA!m }fq`AuqiEH]NpzjJ)Gy9szg{֎@עSг Pޜ2zG=.L~;C~;z<7y!j*{gg' D#pCs|ՌhA-jsH6pk*5[vZ)W#]KWW bXo\,hv>ԊBC5۵,X'ۮN#@g-@dwFҁ4u=QE1(xn;cG^iN}U<)pnbC(z5Ðe[\>CӤ6]=mjth9Pʎr FNq7ڈr:_K{f< au>z&w]+yȉ[^d~ W|M| n ɯX~O?fzWkB%%:HU}8"duɍ=w뭬K=#5{[_Os~$|+ 2sx[. 1Wq/JTYX-qšjѤurh^e}hd ȵU;^t&}F\HzC.i>_*/\)T:uP€'A'h/9M_.= [@5/_>GuA5$]s<08Ϊ 'ĐncC)Uz߇MtNMQk [yǼ匙 Otɍ0w B{^1ɐ_s [5za͎"?/tb5E P,y..d:"S!R*zOqRhbEـFu:U9P?'NRh}c=M%ݰg>߯lC:{z5h K3%=1x㥭1/x Dp dZ@faI)/b$ mtl/C`9!OURQT  D_yHL 0)[hapP)%qfiѦ2l (@Apz`hymS:.gL;>|~BfheH`łΚY)7H_孤S!P JY7uty"FЃbTݯomc4rdp6 w%? R:R'Ł+>f(l,*;iоz}`nb(W.PRvv쁱ywGCu>ngO3B?-q]G9C:vRlN]h>〧&GOݟaզڙz@ݯy]_'`~9t{_nc/Ld0A=cdSL$1o`p7Fw΅\rH%<: ۨN0BrfK>dzC%rw',#~ۛͻ7\%*T FLp j.< 75@}˭GT[ ߽1>* {ISQ&&жx$%EYEHԜFP bDoh\t]KE8Ԗ a"ٳį3I)N.+,A lfyLKSxM9H& %NLJ6{ 6#cRaIRsɒ:̦_t /Zt'-ik {`sWb6cp[c}N/<Nny, ܃L;<)5s t֔.@%ID$-,K 9^ފtc1}QL# N)&;vV:)9 M-ASB> 1$Xm`A lB-}H. v<;m3&.\AsNjhFd,XN M!. 
^ztH)WKZQ!Mѧ9ALnuRZhW{H@$,d(і+-EػFW] U/Юj==bXE,"IIIER*HE}u VLx01`0$2dS[N('hK -ElW=f؍Ócl/~QDK}(9orLPIT#;2[3g]=yF"Ah>HoWmS 7z Q_n~"$bk\dEPmi"&9ޢwSZ ;a0!y7*J'# _%@S4cSC5Ų\YZ"ӓEWG_+gpmMzbzXm &hTٿ}Yuʉ)NT*@fw/ҽvv`9רp?/ޖ-oVF^wl;͖KQ$uF"Zhd(Ҡ={ܢ73sRW)s_.#cr]شţ{CQo`Y \o.ɗ9Ȅ ᶐJH W42j XJqv\:P_{,ߨ0\vƸ}QսzzjEnͭ]eUS 2 ɨ5=n&b=2*reYelO#˶XbV.D67}MhWjV{S Bc"|_M2y s䔂vY RYKR}޼RmzZ7>s2'J7(Vb)zEp6JKYfN)c=YhLp34QL$`*4Z:!"!KtV;OJAKFEցfg KrOTo'$)i-$SNӉL꽺FDgHO8&AX}m*;,th}ppU pptU B-z/sF i9 N*bσC ,cp3Ժ ou|eE~}dVwfͥ))wy @L()xa9ᷲ"+,ҸJ =HYsN<|d+4Hĵ4:i@Yh ȨN*RBuT(:[5cR!,@Pоk#oZ|Ԁ#֙^7tn ƯТl !Кh0h4j<;uh8)Df($U u AvT3Wo1Ū6HDUHO|2ZLo.OS7GaոVZed^H2~=^!`8j]mG\;6v;}+zxǵ b &VRX9X@Dm&D.?DaNpDa""{v;O֞Z8Zg{G2@^#B9$R_|ł4տ_n1QFUepD)N!"c 3,r3@]L4EnFp:ptHk#rD~ߢ-eݳ5 ѿ%Ӂb2VG])ӏ|ghT20dKc'a x c`"Usғ6.KEJf=4x|]L*O}<:[x#NZ-`{RUDXΌ߮^yh[6,iTh Cm~8ou릭ôFH-< FBtX ;.ᬙYȑ5v1Z‘WKHX$R^۵yXSObuhnl[.Kg"qKRHb #0H hzV F ׊qq+V(/߫A+8풅mS Zzì"Ik[o[ى=ch֥ c "8H]qeåMgp[b o8WL^ $9+)۵M-juML-ͥ{ UH*oG =1 k<)e?zh4c6LJuQLY &z;v=NTn9)d2;~Q,n5`>Xt_az%pd[>1RJ7&7Ѽ $ԃE4iV _?N]E94_= ;J/˲{M42a9U~$hu.V|bɷfP$HeABLaڮvHR)A,)tc9u-a2D.IĘ bA$ԊQ]"cT >mBs-,ČZ*rF2RkqjQ-EoZ7]N2XZJW׆rE((4ECYC'ךPaMp0g>PV0j Tس-z>SC47e1tUw;E=X,ZH jtX(z>U/_VVLn?8'AtSXWu,(Xg~e R>G9-l/fƤhѐیR#[p8\V0{} nk"+%Ty^fk)T -{_dee1Kݝn޳/~ᕣ9b770,g7'@rvFStB fOF~l&?t5_R(r"ձ+S}2+|r'0FO|U`jO'ܻk`8 KdoKX}3pj>rzX٣,4gajYTyzo 6iF; ke\aeRa$/jyj ?ѫ(} ΀p>@u~"HcKwq%ycH9(EP`ީ <ш#e)쀎9F-zby&|>Y!YV5=X]"6(m:\\R ( p'.kH#ts؎5=|ɤXi]#Rj:L- ̳2U6}My*c wY'ee:ZkooaR{Q^%t*)=2i8gK-ZfC0!{I}*S*QV:Ԍ<]LLJ#vqJ EXPEwޚDIDGl(Lzv8 ay.J0lxN!DVk+0Nek#@SVd3fF'mݨ֤B;9jm  WPS5"J-S2]DsB%ŕgA*쪾|?B[+^RcvS|ApP{y/AЙ ;aD[k}-Zn+ImG"hMp)&:l- 8H=, n+쭏%9bķ::0?9hUe$}C`$p*{K(€|%CsÀi~SVm9yAMwf\3X҃Dv`@fTC8t?Z>rAeCkT! 
HӚѠaMb)iGBAqӊ%>aK!POHϱи2XyٗL$LiD +@ sD0y=Yj &85&Wݍj51Z쨵a՞mo S&FfJw`gz:_ԭ ) Af}W[diDvԍ&) mRm؎$Ctuwխݔ9etޭ  w-z/˓`p2l]Ep4;WiI:lvd|[ٵ +SWċG`?V UЇ4&-MP:1AuHT^$'~C]= P E~j/t $Q<db,_I4i Έ6 ֿR(lu]v)k &]i<;CtS,ܧH$˯]X9^խa#`67&>Tmȿ=+A}}QV}c£^GtIH|c~\X|- aB_ F7U ż#ZVNhzB[$g7e,;󻳳sd/{׿_,`+<{_?>;tmeխMĔ[r(+лN+?y+d(,.q"4Ո9s}J도>dHo2)1[+{~LttE^u-'G<>|;۔.T_L'G1e[)*qw+$ nkɀ)6$K-*)p !5 :Jq5C|pQ$WZ3' 5>[qw$Tr>7TYeɌZ>U%Hq^({Ljo.X(w*#Y1JbloAP-g)+򥌈*B̊LZT@T S* VܨdVĪ콧B=VN3N]+DS(,CdʐZ;R#Ī#"ޘщF>E-( ӌAR,ݣFg)fWq͌[4N`4.b Ln` % ha{7cq֐T@aVH5֐J.;f"^&],c1zS#e髥.R"ywWR[vs&"RoE6K*d6r+]Kte$|Qwzte]{䯺fiU8v|(r_!]vt)[7Ϻ]bBnBe5_`+ڑD- o&ЋԶQs]BW|[_z0IVm 1г15X+s$(o7kjJ|Wcw)جȹEbjtYtwm 51SsjX]c-Afjq[u)KԤ$DY.?E&T*V؇5r7Ϝԛ@BO]?j{=^} .|P;]UYb'XW+ KUKVT)ح>[akA ZVBxjVFh")0 1v`QN=tܽԡpr.2%֬G@ړtzs% è܁ÏFwc;l81z^D:>2:av6'Dгܵw]2uʅL>NA!_dJ >+6j7W!smd9Z#R\M++`&OmFd2 0 qYOH@l^K#12bmlb2,5ݵl`ozdw&q6%ƽ׏K u.DNR89I''݊Oݣ~ b(8e,&&t#*wb.Ϊr ݏQ!Z[3(& w)XN Hr?of pΙ8w)ARl{x?w2Qmyr:5 ëVwZQfCR+{2dfTaM61FG* J2Z.;xx2V梔:7דn:sܤ FwF)HMIK0wIAQGH|d+:Q޳'V="B4$f<j P+ҢB+dhLƦc$t)b`بrm+/lJRPTrEbd ~nФ)tc ,xbX)0S|#> *b?f*CjFɿ0Z̎T eˈR+X(^1u.VʹBR+]Z(({P2?6P>Yo>Rn_E^qb+ $dEٮIOSDk`V] UqYLҚ8َCB睤2x[_`=^+쬣)xmVop":im#vdʋV?&v;ۻ5"zw 8Ջc&Ɠ qrnn38+7 W@IIșLD줁SC܊H%Yof2AQ]v:lT?jN Dg|ꗒhn.5~8\6:uD Ajr Q׍]AfRr[RtS($HŒK>ChJ[KJMђn4+ևy9vYQ?Hlvl֓2X#'G2>,l_1E G%83a-jaڪW`I TA%H?&UX -PRؐhʩ2S)BYU:7®q|Ty6 OHqve(&G-}5AlUWrS]s R >qIK.&تNg݂ZjE<2QK\q^`jr *֩v"%JA0}^4z1y 0xޯYgŎlKE/ y 0 @X؜0rs?{WF_ 4@l.WʅO[{]AK$$GȮ%BLݿFʦ4A&~'{μ[P`@# : P&AAqHV$"OsMb@>nil#ee^*h刓.ޟ(E䵷{:A0=\m 9ug3oY%4QVl7Ϋ3XR`в(ùWLIMzrVlOCujf1qys{߲+F{|)A?o&x!^C*FMgS#h*kMBWgP9{95|!wER#{g!x4nk#DIGb,oq/01s h iAP*賯ShtzYEη=<N   0_n!/t`" rg85ém⭇_cD#YpOK`A3,(/D9"`JD&}U d[wj\+\]l3)l{Jl(cVFNushg-B,I"OE,ZC"qt ق)9Bf1 A 1I@1VZ c("roP2~]n$:ܺVO1|9uGojNo( CW 1ҙ4W^F:h!`.0|*<go{U$rWM}Ыg?-J=뭚E~i^V3vή=;+~Z烷UVu/_~<\ 8a777lS äk9V{YǵW{&EIFmk |1Z斘7" rZ?]8fNOT/: Q@С;g*b6\+<|Hed J0$C]-9G\`+""b1AșA$o^}Ƌ [=BWj ?^XH5.L \&ޗEm_Ps_(买 ]fLꡉjHd'9)\K,qHZn>-1`3E0Q~==0bXcFog) Ҵ٪(Q^ةfQl1ɧ^+Z6;*ɿMؤ){RH޸jcrg|?9.unz0GaWCk RHoH7b)f_v:':Rfs/lolNnCskJgEj >hYLۤGr$׬ηI 
PŊ98^ŎimSMwمA=% áw4<&f0H`Z[(F. M ?~q2C ,Z@MnS:Βp.crB1z8~h Q{`2˂ lc1k#$4dq#N.e׽`J!0wתn-iWK To᭻<@qq2\ 2i6,jluK2rQ^lczlg=$؁ ک=cŢ-[;c~܌"C-0ތ$`4% g#qvݮwNy*Ph8]gvvݮN*=!ky H<&^zcE٧) )1qr Rrz V dNI`{A,/mނtdщc刓a/y#+"&4|@}kE(?Bw6GKe;bf\0]N1@3.obθ;ǥ>K[idƥ3.G x84X+)XA=y/XqPG[g =Ҟ绮ǂ) 1] B1+r%S̘bx#-IѩTOj<~V~>s_3Bw5f5SK0:/u7|sC_?閗'W²SouocG~~eʕÏZ2BrX֤dSXXVp㾖Z7 `YY0ݝ%H;KD{wh Vk;g{$.GKuz ^'s(оHGUrЮ }"`޾WbZ=_x?/Rf]u [ %|{9SfGf^h TS_k$刺ELw#n.g5DҖPOvX.'汱L7+"gۛ2'ߢh1)+Da&[8iɁfVژ~gE{%S #yshvaՅ+.'@@؎z X{뵙] cxj{Ԍ:?X28?#NFkECw0fo"`J" Ͱ{|a;5a_f Pw+ΰsクWZReڇcBrP\pܚW~6ׂzի/~!0~^/o2Ngu|1fќ:Xۧ9r7Ag ZU~\앝Nf_d9"y5#`6@f"i%>rɘhİHx;iE *oW_ ;  !iQ?@m`;bM|z/W\իr.yZ}1I œHELkL 0Y8`*MPt&nPMN% hMC m0`r !ǂ;&6dǤ%1* PMCr41):ULw ]OS,Eg1?b DR|q/ֵfutb7SQq؟#`{pﶯ05vau0X|zK~3D1tBz(Om Nr30<UQ{lt4 Չ*6`wl)c4y/ $ t76ɖTNipp8yW?Ivz-^BFuy/zS |cpu㽳,ShUU{g"X'm W\# n@*l-fX뱯A 8=>K܇Ds]Od :}q[|?&mCg"XE7b iӋ \k %̖nPuS@S`St~S$*$@y$1ٶ(Gy)@͖5&z<<aqޗlFu/G\ Rk{5􊌃)ɓm'ɞm2+ ug3y3f4QIGVؚɶhL&Ci;95mYБI-q,bzsZ3c6is!N9WFc<)H3\hc2YDFk41;$xĮwiXj.եx,A:1?{϶Ʊ~ݳm@8g 9ZeIVjJrH]!@6{~IsIΒNs6!њNA܇wf4]WA8wy <)XGQf݊PmMcHYv<3' ٜ -OdFMZNGjACMm'? po2 f+^onBE˪8 4}rzwХ CKl?a@%9Wja~M ̈́E!(޺V-% R,![-[Vۭ7+ hO15 :Z-G$Jyvٙi7W7kDE1h²Z|MY BDTz_̮"o[.'SH]/!}g tj&/VWsM-V-+iZz=Z2؂!wsnPH#d<ǝR9)* L +'bjRk`auj-*NC Cv!HT 6Ũի wz3i egv U jcZJ-vkw?~Q׫q.o[,j&~ͧtE?},mk[<v~8g~3?rk>Kx6S>_oU.b|]u@B&~u+(y%bY"r~'H=δJ]yMn3V|.,UPHbUgQJu-{[6dH9p4_\֭j1_lkjJaKm>@?d/Vy(TZV;' |q؝Kv 5gȊlb&k]9aL*+%ϖ+[Yd0)r01"!,FT2[ATXU|l6%YcIj;7LgB[NZP$S)"}mi&2m-  Q6?cn5Ґ,kVuӏSLPX'h_IVVIL)Ֆ<\d\g%(df9skLjoߒ7d7@$6uesi(~fa9ZIQ$`6 .ke"zvY]0䯼zE[l~i֞i@$9e}{ Fi{ki0+:w׵޽_߳}|qL:u>uI[#$J{7ޮ7+tJ)E1Z;tШ65ԫf@Y j ڪճ+ i$cɏKLjf %[0iìQ - {a5Hrf+p2\.6+qɖE͍k}hy7_kg r p n3ٶL^-'߯Nx6Yk@/?#ƥ_7߂%P7ځ?PX}Pփ1}Qk-kS-h ,h?6|{CŠ;EfrO%܍b߽;N?ÖXҘsL /#K^m]t,x|yR^Kh`U2{{ %! 
VmR[L09OM[|r4k__ kLǟXR7x>njNDtp ]~T5ِ{In'n0uHMnͲ>d<ۅсa2 1TZ3-Ixu+vKӃO8k9] |:a 61U"pbhh#gr Zw֐jl;6q@ZfҬkJ.16a(Zg.e&/T3o]LPN`S,XOSOkdNj·ƃۉԃ;\zG8vZ݃z!6?Fyۍzɋ ?8rG] sS c0:HrZ2TKrX24yi&/RfQ;2^&P(&ee4nn&gFpvr<F#vR" ^dLɓV-~be2|9fX^bhmhjj99úձVú& y8L2'kz&'#q1+zJZ'=n =LNɷ1`|=KTR63xO.ZVϘKU.#%|.3h85yꮧK32莭<i]j&v<#j{*Y:" ȃ NeX䭓j;hCo:'Ks}dO^K48j28h 벇>=M58NK`k>*Y%򿃝{t/SC#+cI{"KZk4 r\ DVدtWg>ʥcy3/׍h;kv|zubQ1c{E&xӷF ~9a3qQj7zg)"W`?~ >^:НAJ^o"\.b,)茁2W"f%">[{%vk@;vV'׭z}ǹNv?kj(~ix6;2( (lĒ@8'wgz 3'l<2FRI.2}LzV<㹠 N/eY-{hOoc u.9A\=5'czV؝wYs3}kJ{njZ0`).7kǓ|ĵpj?x f?C3%OEZ:FS典|w ,"2~1X]VNy|u,kChV{ _C^8\`|Þ%\ѢM,N>髫~W[ Qd="T[D1QR,_?݄Я77QA eƌfIȚb2h}h ' PNB/^66iF.sH@8["L9* $Ri0U$r$e9 #D 8Ngv99偈qyp{?1 +Er}xPK3GA,ht4ʚ f, AFxsRXZ6,:@}ai?`٩ǘMu68v= %,G" ̙}Jh8LGuA(Z'=ti),K2T.\Zʑs/(,Sh{k'%| U$%/2Z&`4G6Cx*nlS TW׌GWqJZ+d!EjcaC9_}OܞUiHExEw&$:XQEb)ޕ%)4kmXnԷ?`s +"Зi1u9$e[8}gHiċ4e6 (8@hp[)m2Z*g6hb.bElA+ŭ!Hcĭ`qU/2uX,5St}&WE`w` ~J:2,Jnqk,<ɨc9UӼBTE/LK'<5jT7ۈ\YQn!em !,)]DJXzԗ]D(J]NJV0^O#&KǝV[KQx7\z" EVۤ$! ƅ2xJ@T_ ]TH.M%ëCdlf{dHv觌ǔ17D?Hv؛͛$>*8y 烈u~.և[4(݃m"V[@ ׀V;- PFD9:emI?]/n8bTheb\^vQjUwdIU44 ޾0FtzxQ}L;R=&Ez=VR[S6Fe?~{P%R5;c< 8b&§X.+NM |ۏqKDݗDd=*j/鸮6t:}rO'TiKq\ejӫ{3^UVO~y7KJxar7Uٯ3~JGȋO6 l)LSiKƖZPbEɸR-oa\&(˚QQ|kf(e UYW_HBL.J( 0Bb 4밐yrF9Ga7*AX 5d "(#r,UZހ<#HrPQXTy<9-iqTLء8rRp(VgbL%J+%V <P8La. 
sFT#]5쵫L%2n$3D몹q~h !mgnˠӻȃe B, U"ٖɶJ' \c5l=#=vWSd_:֘le{lzSR039}(8b8cƀFerjͧҍP'ꊶx)azLhO  sHRq^asRCN=aAH(筰w1#&!Νgx/=h}';,%c!"X0(+Q;," Iy-%oCh#QO9ǙPӊg|Ʉ^LޝGJ>晴A;VhI;gw,\Uw2\s(F _*K\z[/aҸ$Fwh6:hpDp4YÓ >-ʪY s{ͥ |g}M)HftֹU/iµIk?t)Lj*~!WVW':IrP[Yd'ZI-BTobpޛ4 ˛eӪ#q3)Di;Z4Yи-hE>p BjiY)Llnc۴KL+7^;z3r9o_ڒ81"mae?w=^|h\r{)"xƻT~e.WMmhuG?G Qbz]vxWnnCIq͹"Qi,#MǕ#9jBʅքՕjU>>5X_M.Lrv-mޕ[~q0 *ە \}'ɼJ1.䁫76^'qM*i˫)*ׅ&?g_ʓ_2u¿ܩR,.ftgmIx Pv<|lRqƩBYozitq^~tױHSD?ݐP3OzghGo7 `p}Pg]J@Z-SCr'Vm[t-:u^c0Nޛ`grĞH84qtx$%g0>RqԬa˭iLAF3y0䀕eN7;o?q7TȻR8g'ZJ2w1ВUb_p S$L͌aX>ԙiKƖ _k@qΩ1^V][¸K4 R1#qw6MJ@DF(0U DI+V%V;ő(c>QtyXňzK2KI*qNg0 ۈGYĝ*!8˫t{/* Ջ3)X>qa!=a gsǧv;MS}K'o_~٥u)mqN>./o.hάP[C_upJ+pTD7F|Qkɬ obI2泠h RE4F gkmFEAvȼ/ aЙ a30x;ږ%ǒ^oHncn=۶ɯ*ո/oNgbJw<++$y a%#Q 7\ eE*bl͜5zܻG[5P (BW3af gG/_P,?„<ˆ!>S`KErؤJWJ_+@)d*_ٺ 0tSGFwTe'J~w5%;__(v>;\<]un;kST7Ç!B׻@V;bLstTf^kɑ/ '@2m <653:c9)AJ=3w2a(0_aZ'9ݞOn ጶ S\2ErܪUi"X%! dy NoΛ~\dc+%\e [!kˣQHr^ݤOA8q_}_/()^q^k_p(7OoE%lwdfgCRm9~ a%y|&CY:IB24\պdZN"P\;x~?hO GJR :{;[䋮@?J 4DD79 e(tTֲJ >PH\urnKK0P"墕X2VAYX4*Kq)2pҖP6LpkI%q"hDչX fa$踭E1+K<%ļ.!DAi͢^:K U 4wD1`P'dF~f2T4Jo"(jI"BF,Xģq6H\i8]ʓg2w"7߾mXJys;Yܹ}C]dSЕoJ)?dl0;/i FѺ[[/U;?i?st6oa:x6\f͌C?n>IJjdh*^eWUxüGyighgo$eoܖRG4~qSZ}i6Wr)>Ƿ~l-Z1>Xjqߙjf ]6eh_ZiʋjN g̯ӹ]g`'q<eՑU:cy)sbaЫEG+T]-s F{p86515@>iϘ47I\W 6Cj IЍ4WP\%>m1"J4P=;G3\.W JhëOl6T{ݛ_{GZ~򉭾LOoq>M?{{qAɐQ5$CV^,˛.`k4[\Inb};KV}A/Z"Bq28+Ax&ʍEd} GBY' ϓԆY j[:{PHx?M\m#&\lo)HA+&kA{ 4*Q@'I;-+6wMNZ|da?э~KeSsng휓%${dᗻ$˔$x/Se;Ip8>J$`R Ʋ8?5tM$;᪴mIT}J4rUPljJ8sn̹%8dV'=Pܤz[*;5+49JJֱjJ lNHSVVK[K)U#p1WZj+jפ( cIkk0.p:VGv$uZCP萡, Ёք*H(~ Sa0: X橏Ex2bXo.il($%˒sg=URDˀbt+Z$U]Yr%Yش(X=eY8竓! IPQ/ kbq '9еB B8Dr#WQ{Ga+ ҾUsEf)lJ&ڐkcªy 8T&&6Ec^[-%(ʕ*2=w6a;N[O3qD3գN%hax%3h)p0 w0i܌:fV "ܐDբ t+A3R@:]W,َTeۑh;Vy;vIo9 c+o=:p`CK#C_CI@n<}$:4 }Q^Rw4ܠ`1m.8vdy2q;f'!;Ie"n;Dlq}by؞ig.T{;hŊmKPoD\hUGOIY5һ.JZiklP O覧٫pɅWqT"i'u] TY"g!J-¶s gvVfz ?g7qe1vJ*o~ 9o*ᝮ,mLjE; a;R֊ Gee9kweUv1w@p7VwBI4GrIahs8!. 
guQ8MC3JYEH V3eu6"9"3pg]`.4;'qf7g_ fPW;s,gCb/&9hoKVӣAXW_V~!f-{z뫍7'[σ;6 [|DfFf2+(ʭIdckp2)XZNBQw]L"ߣz~)Đ$7 d"飯K)Я(5Da%;)g"R"*/jc̱H)z3 s{_2$\,ҽߑI[`r`OaR<h)>Yd ٝuPTӛݽ6sZQh4݂Tjy]09 N 0QoT=Z`֋ =Q ^ pEp\d$TFB4 N5 V(+)$NO/n /GF}]d5^"; U&_vI4}58mZAcC]1,m<BU;.;{t 4t-Ifsdd}Iό_L zd 8z+#F=D:T]U-~dv^c[31],W=*J y2$g)RiR3E%5 08v|s y.,ٴҞŹᵍ!H@{?w ]s mOxEfBPOw~Ez1&? J³){y\2E)lGmBS@,Vh%k5(w$A oEJ*f6TWX[p&A& ᐅ[BG ;]zʡYB6ǩ]{^FF"8MXu͝@L5:jpk'lQP(mhMv w-%cs$֤.2H)m{ڜhH`*йRNktkK'yjS[[n4EjqĬ\o:6N7^GW28+/RyRú9 YU ׌Ήzaw@ޣ۟؟O木 Ď%F^Eer{W*KTPR@)xVJ TvD3e0iupIJ7VNx fb);&3qYvmlmEk[7=!IM0K[ݫl:S-wq*gxRc.!jU_OalF6^/\ԡ@hg_R>JN2#g|GV)bka8R 7_)j (!k8+[YI4?_$ka3Cᝀ^0]{{yx5%p3;dYϋf}'=ho[g=LZL yeawesH͡.z&4yebw9Eshv>iaYPr\ûQe&(!M> gM*۳vmV /Z⑂b@TRKGځQa=dxJ{TrCo/pc kRFl=awȚUV=GwCαS !Д!<[JAQɣnz@[9!@IJTsY"M:F9aRMŽs^ @f=&ʴ*-8nM(E59: A*_5k2Ǽ`Զf1-5X򚥋Q1J%F8u# 8A7,3"+ +ga9q@ ֒I>,bk|nL%0)j8~8.S41ECSt7=0\]N,a|nHHh~hh}b R0f3bC;;nhebj)<&߬Zo\t+Tb z7LWAKAxO?f+4O i>j]@ T=-g 1"4c=x8$X`e fmSԩŭv.z_f۔OtcCؠReśфiJZ ^λ8#ZPI>@8c+9?ͨ} =]x >YaTWiYiT%5ڛHv=ioƶE֔g_CMo,-R II%YCq G$9˜mYT}6_Z3 (mi^'kEBlȖ K [QC{>o^n xVJP8,}^|pOe j O;Ǫ -Xșo(D~3i0g!j8Nk?,7~M"he0C!=L' ۄEg?w!ȣ,8'.Eohz\/z;#d_L9-oW; wc)Y3 ^욚}}6 gyKmܿq#"f'm)A_zɭi< `~@"ށ(~)WφOϧf'lvJS3d2Pc!s멉%:iO]|&O)w%N4i$WP3ˏ >}OL$vjX8@ve@4< +=N fx j^OMϞ1qqzjΧ_h41E9J,f.,f.Odv7V:QJ0@sp6~LZ2תzJڰӵ"&G%gl.O3, &abez8Z6 - , u=+bqMf3\❕i.UH#y@r&3R &NЌfgCv6 ِ"}lΆl}6BϜ8> buωIqPKkp4Ɲۂ,p~au0Z2t~$f1 L3ãD쟾(g ^d/`/PFP ~Vva[j'645/eB2$: \b_?TVA5f_?-yorKj|Ln y!g~:(Mc2!ۃݣeHg\D{:ԪXJPAKXa*6QU&CbCڅ' 'ՇD;}|t3;" !V9Xhu/}'\ʢ0 CDvmNYْ6r8m4*ْْ-ْؒwwdgKvd)[RhÖ:C}&12 \0i lIΖ%Yْ6Re}%͞|O&KTt$wb3%-}Ų#~E\\ S-N_LWoWfL )B~{"nߐU J:Z4'3:#'E{&0ͳ^љ)A{Ɲo$˴DֱI_=Y>& JdjO8wVX(l8e\a{>?ؗo2)*D0SikmZ.㹩˘~4SReeDv~#@" KOw켗0ݽMM630=8Zt2ab0s1m^4t }֙7},f%6N4Nmm:>r.QR/91Tmu޺un}]bE ޴THEP]}9E/,S$#r4aQ3_X>迠n;m?a\j ~9;J-+'+m,RQ:jl9:r]h xD?ց0s'yw t)n pY韯WSgz π֟ n\vmo6Ƿ]ۮǷe$Cs!Q0ǘСCC 7(R> ʧo3ULg_%Gog=̓-0vv'Z}4Q[àfqªPX|P1[ EeMWbvVTrt-5njsG9q5AkY92RƅL8BHxbgס E\+ODƊcןk*FBݾcY"eBG1BZf@): @TI,C7BD  ,$C\@ws4ÑXH,39e8܅Q]ʨ'Rč Wb+@\A0О@9ZAڍaFz"@D#:Pľ#;DnHP]+`4{(4D _pH 
˝=ݻ0`N2PEq{"VrRBu!oJub"|27zij [("Ȏ#'%VDzOF*8SJy}** #H@@#U2(28TkyKX2 S(=}ANuv樏f/oNe"0{!5J Fg1@R7[.'/Uilؘ.qΒ)DGW:rntZʍ_ucWڄ=L E;&W؃ `ZF0rēvxz D6N2@3\\Hŕ/# 8D$R KQ1\F08.caHIDY GL $g>N'pw=pS-CL l(Je\!$+A ԷL>N $`\RM:Y)D#$AqjWy8ƋfDZYL "dc_'޷'m|a^:63ogï٫Mg2qYz7N4HUOcF+T>as"P{呷;oU"f"RĬ[|M~e(Qd%u_ںuJnFM~( iwqU[}dK[J38&r;KbBֿU;Q+b{Ѧ2&V `*-کg|X T\!)Fcpv|cYȢ|q vz)tKळtTrHk^:Ů7p ӭ~O n YysnMf ov҈f_| 㹥]j%=]'#vP['TMOIuS"jEh5}4 <{J) `NJ[y>A&yJIݱeE2E2;O.E-ɩQ+igtJ{ "Q&[$R _ђHv[c2"Y5>A"1OH^:c*gNNm4lv#8 崥ŷ.=koF밵ײV`ܴf Vڤ.E&ڑ &s߰pia!m: %(*-Rk[ۇq *)x-ɚ%'T*+Y #t90/mwlg_njeV` QR"]hȧx̟4Kٯ/g*VBڏ6s=8ڠRlV0}PxF";d-ɓz"<X='z"A3zAjß^玴d- 3;k6Wt+yjxrZ@Q8J6˥._yު; ߏG @ZRoqJɅ#α`$EN"U y(>[Su# ?ʱ:N*+LyQζnE G>jy )?TTzWpaC\]pu:Bz׻4 95w^Ow8׿tYlҹ\Jl! r"ČJ#WX2WI4԰{\'CX3JP鵀FX=Ζ`k &N4x %>ZϬ0dNwve"p:q7 (g^5jp9S)729?K 3V[iTȶkJpI&HF_}͋'%Z\0Q0͇Hўү~ar9t9"HI_]/0ǍJ_ [2Cz_>GAVwBB4ќ ,$eaBUɀdAlFm%1&DFb#7[.8oyp b!f@d;ofP>i-k3}M;qђ<6%_HG[cP9y>^WBw+ԉŮ8oQ8n{G!rBL7Bӷ bCewCs4i(r}ܞN4eϢr fH89zTJ()rvw$$:{V/ }R|DFp)m ZYŀ75R/㈭6fc@3 6 r< tFSMpҊZX}{tOǚI#Q !Q-FR쵯R4_P\f"TfBKɺiRO(ni RT ӊFc=fcens]91|vk?*qNs7J3fp9@^2 @rS/\/{v 4hTJC?a ӎi}msh3ژ3הk]o:L_q '$;0e7LPiQXH`PXytUPl[-3֤ӰLFneACZ%||<کaaa9h^C$Pkπpvj<)@Xe{9Y"_c4֞Scun:.]&LLwYgz#oc4QT֔3[tWYJ0 Ђ~(B`,:1t}ޜj9 -4(LFv YJb=gLF4E+$n.bi5.>>2A{SE,:YW^a~6X}>?\?Li}=g>0fH~;Z݁NS!b?xg*HLnFkΤs@&,V<I!* k,+Qr<+PZǷ@Q K붜NIwk  "-ҍk XeH nM&1$ ۟}C ѡܛ24ᢰ$/ rz"JSzxI(]Jy.mKND?-6(Vù8sOe' WF_d9[rjg߆u0"}dPeBEAeqeno uJβpL'ͪ@H6°⎓2HĄ!h^Ġ!ur0xaP_ozPۇY..D7Y`ǣi}w۲WlYce.4Ʀ}u)!. 
banoF7d@<71 Gf+vgQ> ^YZ)腑[24ܨ/"aBj{e*r"82!`ʩH6*VUMk 3OG.C$9VU4U:eʀɫt*R3J-tKrD1Pt2TfzK^)uT՜.Y{f6xJ/ w6VrcOB4}%f VMnҗpx_uXꁃ|ELBaATBXHOMd_Y8ƕ{fYeAv Ay{)7jm:_0Vٷ=Zڍcf@h2َFB3mV6VmDEc$`llIA1nRn%Xv`ې#)Ҷ]zIrd-W!j,SsWc1K,Gɖ^>F|#ݔ79N)&QfKo"2IofHM]Β3n2ua|%Y1vɫc[ ǟ,pY+ ϔtE P(Jea0B /b$ q|M.BTrAW *~ڕޙJzFSQ 6%6lL=dݩk !dO>W2ȱ~N !1ҡB*{ZC|0 %*i<BB;L?QaC˳izL K;JChLlӨ͐zV,]J>@61*q%V 3o,QkChiƠ4"k1hʚJ* F4Cu߄Qk%URX cʺPzdd`#t.h~`0cxb%K$ń0_ \LH2`nKR`9/wX/c*5dT.6F3 UBZBusHaJoqvvKJg sFq NQ܂BjQYi\pcDíL|]FǼU I"هV́/荴q+[@^A&ަrkmIs3Al  po.qSy &՛Ruˈ5GAj,y=ߕ2졅 4_ءs?).3u(ݑ/g͇~x974ӛmtA޼ӇP[^NI]u} $5:<[6b\c餸}M.u~7_;7Ɓu^{q|}Dpa"E75)rzXm5s\+4㯿fM^Z@H:c29WUe`'w61E?m\qWRW. JLFF`\֮h9+v{w2"G9#2vÕ y%Zd2z5kl8cc.@/ۗ!i)=73Z2vΌyBvېhE:}7i)5SiO, !*ЇSY9ZΜ1jUyol9 5'I|B6^ߑ& i̵qOڗvG{xcӅ̢ڽ5ܗ?w\=lVg/Uf=Yϵ|ZqF6ҢV]%l*!crdG_V!Ƥp 6{e1i{/SI7cח]~>L*Ut燻I:c.o&$eYodx+iԺX'{B$A-??M7rY#/ , badA`/"%N ٶg2bԞ"7)Rq$3ŪX,5މDN;ՐSz UY$,o[uG17D}18 fƟQ wh)Q]BQQbS" kLBP5u&K;:_T%j# rbf@N4+^]P4h6đ{F$3Bk`I]V%F9rgQ N[90.yC[pNˢk+f'%h7n,h 2cTwb;E2{?'032r?n SW(eCXc[SY=]Ɣ®s`O&S{"8FFBQ;Ñ%@o)z2?~R?v~di} p < k>%a3hy; h9ang6,op tWXZ=S@Q (,Dhj&J#kye0t(!ۭ5\|Yi}UW|}. y6dSV%c*䣃Ǐ2oq\{\km?\~\7=4IAlyǚYNO{!-ʏ_frM2Po 7*e] ٘JC+ e7֖#.\Vd_Em)p~:\-9[$尝`+yV`Ԫ7tQaLܙSal砇Gw~.yG<8 kC|} !0{y;(N{Gka;3` uzWzQYe7Jqxy 3{z (o$";?ѵ&`~yiEE(? ݐ]MˉP(<~IjFfHIAXJ.^tQ-HL漵go!!3mPUUu--e6ҎkL;[`҉dKR|ʄ 0:Y<d0mLXm4whek` Yz02y. 8&'rI ^ic/ꗋﻄ{K65}/_\YRvPdcUprrVO0?>KyH ~SF du%?꾮Ωc"քrԧlnEH2rM7iF7+ ]9sC%JVfiy Bơ+{ݠ24E'*>;( G փ@/~1 ܉/1asq:KB +_ 0to~5O Ĉ.( ę`{5ʼnZO?WK c3~;Юxů8]U} Dž .DEz$ .0֫z䊭^4_u|IU'WCi֫"XkmQ(sqԘ]l@MñNX? 
׫(^5ś )nq$e'!/$Ge'|t,z=%8,*_m{mONu_q(ۍ_]ζle(?դ.0 RPmڷ{C(ݛoEceh[JwdcjgZdM̈́)A 'F@SOwl2׻f LI2yIB ŚQFh q/(+:mѽn>Us cѷjݞ~>d{ZN}FTƍ}zmI6vO~1h)״xnZ}8iqJdo\qP~G2.4VEqL87LhB3ޢR!_.Vgi_5}JX> 'g^me]ams ڱ=vl;z}$~~Ia"*+ ݹCd/jm\44jcĴ a\hc%PO9 }/o#Rֵ8%9`d 1eW;tMmPVZJ7-Fb"hxd?!U׎{ڶ45(Htܳ{+vP[BK8 &uk֔#jDN\5!uTZԜ"[7MMxVɆzUc <$+R򩵚GMAN-!I/UN%rHW.)25^xݐ.Ry":n ^=vK&4T吐\D'˔Ttm:Kz%sfGiZ\IڊJVB$RQ fjά3tN7a&RHnT, ZK%9u; @#ߖ0b-:.d{ ޝ rs ?/'@KBHIAuMc PYLh.(7yEqq_~K.8q1\ybJho謇J=K, N_Ol>ُ:<-DKU#U| <(K֔YdK bh] ,咷3x"Lz\DZH`ӕvp(^FpD &_bD9*OJ8h1nq&,7d|≩K$CT C%UqyVXsBalV8\6AS"X^@gʰ ^\<INY 1Fة&YC+fUj5E05Zf@uFccke)Fju:7YUn:D=H7"ɬFLh7f'|)*}&Y7q,@ƶ~JnRY9 yc!eP6h%r5gll!Jv3 5pLt8Q B2j &uԟP|n2#i k!CQL,֒?"0{+f9%9%TBxP ̴{tD=\,(=(0]䃃ۧxzJFφ'88a$s/T\c4uhİQ}>S(?d2\P`)4!RPAPlImb\, \, ҙݝa*9wg7twҮCs\s2r$)f$t CHםØ ɽxLBh2C~Q!45 WrUT7*e] ٘]i-PFR#4ORRRhlZ6YFBJ;bD2%_خ@p%hj&IAYi!RvjVZQ 㵂X{z2R7r\L[%S q{(w23d@~t}G]>z iDyv^-{ g(Z{la;;`$c5xzY@Q2A8)CjiJNߥw4PG5g%_37Jij2Gg()= \2)iC2OqNa9^7+l7kUd>iwρ|bC$~Ռ, *&_l_ObW].(S $h!Њ8i)c7h@:EnXMϦ^ebWS7aWBҮyk{un n@sO}߾KY"o;e$Ss *{R y+lP+I8 d+/%S7JaN@20"$.Ѱ$9w [␂u4 h9QeO]@M!,T Lj"',Zޒ$.ys &V[2cH8.W2cHyKA% IDm~A@ \#űa;ԬRFtF[ޥL|-}D1&m'nu's\Y9`Tеi_vP+P%:5 PM]75m`[!td6Yzo#l]wm\paִqh;IN&Ka %UdLPLɒ;OdK"sqLfnw2YPGvno3Y}L~/]?~2s7WK([ÝWx4-)-$nxp.3G-ӧ{ϘPGLjzV|;HA=" MMoF!%Z f\UGڼTt6JGfwT5ucMbIdr9%5* S!KOXWnm &Jv#70B$-0ִ+)BΆg}X@ń5~hYB`Pr0UdƦg%9BCLy,\; 'G9 ~҇ Jv׍36e@dzv$aƠOŎra50g.]#0@^ҋ '|44v.gj8?{q,YMpGRULD"XάK\0LsnBbqlXF, $Q~sҼNڝONoY{e+xk p]_[{B~3-0᭲ΞUeZ U%!RI E8LEX \!@ˈE4DV=ڳʧcR::F1gdKB͙:<5Y#0Pb3;ɱҔP"LH1bĒSC i Q,抅KXrVb@,<ҀbIpO/OH~q9qc݉:AEP&jQ#"~zVm'S\Y grћᝅfM"GP E{Th|Chk߬L}J><]}l{-_d.s=e*%窄qFŎQtÌ=@cۨ6j|\&1#AJbu8Q<}K&)+¼6'Bn0 Oགྷ~e=f'B 0I2Mb1(2d|T\A> לb?qS]B7 <  Hhab V!C6T S( &"1,ր&L¼9njJ(s+a+TTuq^s 3)({Y~.L1&z.,F0.B1{Bߎ톾g>Zo'fꇳ YQ፩ 2d1 ?{YMCg7^)T4k%ɻ.0Va ɞD$dk^sخ`L[Jt3`]U1` ~4<|*˪% إ/bowH=f\&?i%KԜFSx'f :GuuFu7,3gVQwԸO֏@.Cq}Qx#*isBz}?YkCCD2:&}b~״ND|^cحÚ8BoKol0~a?߶y,h[Yzj Z*Vzo.ݺU_x/Gw{g2h 
a5UjFvZmSSmP5x*ur7{jVq'j'[l0uɃs.jN+HKκH0[cBꆿY&8ZHpuOJU=%Wt8GF__Z{P^5yT<=y.t*BۍOJňRTnޙs{7Kec=Ȼ-jl2$ó@zCתA#-hgP{xtoYFQ6#!F(;BU9_dWܛ $Jhf e~/,+ZCTlZI=+yXhQgpC:RuUcH8!Ur2ժ-Fj6Nժ-Q'6r>V%(^[I:d;sS:.oZ>Sh-S[XjSU OB`i-C"X*R3G,(VPfF( D.P8Q,v.)S֝_:WnnZ]5IVk*ڄ!yc_|}6 Qtv÷L}uůu7<_09Q]z̄LPF*O5[n3]ޤ~F#F8{gb:20BLjeo݊Yp^}m]w;vDP4*=<iR4Lo-Ttog2K# /ɧsk!@v0ϛ,tٴS%DXi瓀 ٪SZ%e(Y(T[vmʈ6jv;^onpI?[cJHEj9zh?((ysW P7G>O:k 0AŪ@,i q촢-:5<i[Cxr8qN+䒻=I|r*@dtMX"K{ɡc t Eu?'esyhOR`=0Y (V{}WK4 >8ꂧD#J;eᨰK1f5F1J#\4^S;{clSx Z' }fMPkAN(:/).^~:~L+>5bT6yH5`W KP/dhR?d*GQzv| F}ZІhwSj JVU)Ս#pKiϾMrɚܒXidX듕 RU{>9j(ή5!9ݘ\.Կ`>kQPK7N C_ell]zxgo6,ƫ!ݮOh0\܂$fT3y rB"V2 #Fj[yx3ed80R(d1f0Ԫa0^QB<F"VLϼ%Um8Y?ʩ[(`qJD$!(0K P~12T;.K־ !vvcMBȨXqN9(L,ML QH Pk1c*55k_Gp:s@:"dlb00 i0("I"*C(eBt}v?h*쭏 tiZ_cx`Շ<Ya$њaTdQ#@uPH%Ru3й~R82{]%hO=@ )ꠥPSmԪ1AX %M[Fw/R.5h@¤V[| xjǘh0@p JPkj}R1C;RV cCUY+'`9>SHJ؛ wFaŤ޿5 pm6ĶD8̳|~vXLK jf\,=9Q{|95"[GFB 5/,wrn64Q`v 8P:B$D&Dۈߔx}\]=cnA7+D(Ub $Xk4Q}kB+AJ7>>*\ݾJ0d 1, xHcR"c1G@x/ɱ `R&*ώɟu C"WCBdD\ Hl6ϥv)b|hYȋ"xŧ-zuj}jN]\Ť*R5y Ѽ^ŀGjN¬S] N5|p0܉C;VpH/t?!1G&aL锽w y~ۑGh<\Fo*&{,̭gt^|I.棡?p9#{7<a3N8 -hJ#!UȐFA ҄8 JB5<I:Nra’:[@s!Cb +*=g Z$uz[{w$&'.v sva6\ص=]I86/2\ReRٿ%E訉E0t]`97,Ԛ@g5y!'? 73? ?}Ѫ+ۙ'0It˽ړ|ep\s盏^9γ{|jk޿uЎ0 Sk.6>St0^.G|Ο4x;e1!R6of8g;Bs2m@蘊),0dF8zb7ZINPYA9tH2ᨮQr\nu[hڽ$ӳx1Xg&/SgPx7CnM#owL2e5wWb.$S A=xҐ_`%m:(y8I ul>q ^Ġr7!f͓NPpjF6py9&m_:EuEľm3!_saXR"Ǒ `Wg]$2K q"tO,F7\d6#Iq$XR;CVac[]@؄)Rˇ>C{䐢LRCIT^d`QvBqcJ2L \._>e@e !\xEC }J=G?f~8/T! 
t˓ry:B=/Rm5Ņ˒ŞKJ?~9nx$RL^?Owor<)%~R{~=_9u׏'zGO/bhfνT<^:_ %ث%\ M) k ((+4eu:;v$y;JýH"MnV& = VhXtqC;=-$q*, >RHU3 x4ԋ:..Vw;ڭ@<s316W5㢲tm1&y o0E Z8+%BÐbj iD$+U `VN κj:&"iLS!epރSZۮ :x{~^-SukZQגFTUזɜп-O2iVkW(u:]bM#C4uq0c|]~RgG5>ĉ?G\R ?Qd7+F#aF(F} AҜSmʡ8%s`|Z*XMD@_Αh]L%:n@.A6q,wiRhK3ZB/3[+9 %l u۪Z+{&qE{2ߟk!p݉tu;|Xi& gpJ5{٫/8ysM&>}gǯ^}v9XfomeiN^~L;Fv+ϔ1ZERn80In\-1.ղ c/Ų`8I%O/}11=7B\$YnCϕٕ϶<)&X+ VL_n_퇳G9HջٗQXF3/|93W%Y$FG FA%vɠWSBaxMZf>IY|%qz狭Tˇ@ʃ74%J7 >}CNFn64kT񰟚֤p^MZſ'Na6 5Z͗3յNT=DN_s jѩTZX{3R#Jjr8YCh5OuJYy} 5O^ag=1y`x0\˜bIqtJ%2#];Qێ - 8PuQiC5_P9CSBw+ ]iJCWЕ4t.NoDw(w; iNw4|CVDyqE ףlG{M&ZMVDf ^J7A+igrJrc0R(304(ˤ0L s.xB<S'U;/٣mR-~\f Yu - ~ԗF3Dz &!e`rd /ז`E0Hv*ae:xm ͍۹@ΎuAa#/lx>֖pH_1X`?g@Ki{M ܪM췇Cy;'E*5<(brJD$KL`nm@\bpN5j%Ckg58y=Y [ˍeb '֜|@\,.α&@^Tbqx].Q.ρ`a(c\(MZS6Rh~ZKC8u+ uQ Dx8 k %CR @:~Z5k+4ۛ /gN~LnAӾ/<]}A4*ǹn ?30N¿{L6X} ? d:rY/3Սz!-ށMYh75tJ!k\nnu1Fuꖱu{8m*7V[m>4WZ:%$+M! [] QebόmYڂnk!߸)]\n@!Xl:uź=|TlSͺt[ UN ~uӜļ[M ԨYNn6ogHYڂn= UNQqk!׬ [] Qebn}+)?V[m>4W[8P#Vja]"q?=l8>ǽye,Ōa7ItA>)-vl<9P'BKbӍq;OWm3y%J{x6vSG7HGB>wyKCvRO-&T5@mPj3P,b P% i8E@TlZy3QxooC2eN<3I_17ժ`G7Ё ?m.j .WLm7)ⱍ_CQAq /3y<>Ô0OwaV$jI+/mx&eeq"p+@.9wf&6dMUX !;|?_!~}{u\{UtLSE5?CFa—=v$" sEY 3nNcOfc/}0 hqJJW+xیᥳ)t^~c>tnmkj>^*::g39)1dNqа\iYeQ`%D/0]8vЭ >@Uo ݤY/ËwK3n `'o*] YtLr%Q Yњ`A/fNjl / o'Ibݯ*rM¸ ~[Z5>={.O>9N PUS&4e3L+ݍH(c-=tHf谠ںɍb 1~_<.PZߋ~r-p1}>U AоyoqS盒J~$z}ЬXL҃ -{;CQj>^>r^ <*9Wj3y5~x9 t@zz (U_59(>+`*U(D. 
NI1\Yܟ+/-yfkK(ohƢkkVdx0J}|q\..ekL3 j4L(qL0K0,Tx,R5ffzDV$H3He8bװY2`\agx(*"ȶ,mJ Z@dJ;!E: w(6!G8j/zV~\kTQ*S80N +į-qCܧJp,?O౪F2%wa:/GxxG +qei(U.6Ѹsac#|We["_Ii@3 AjN@H)(iĹ ̊rm'$#3oFIʕt&{nA1tl ] 1^8 fj%pV]=nb->F] =䰧 ʁPXsdnX; [6FV5rF6EUR 7W'p(N4SI,I<4j*bu۫r%G Gqp|USVi5`cFU[*8|vKN[~#[L!X!5`ýɵ0$d.P4V#P#uk7*wPXYS!PjYcC {l[''wֻ o'i0" 示?m;Ґv$VP,w.ܦReaguԹō48J4ngr:Y&GX=.O\>Lu 7pnaޱCi ) ZQ<{rMMXoRCƪYr C LWߐkYaB˼+"QRߛvDF_^V0?"#&DF!'ֆ^jTH)&i,CfI* f R)DM@%Z3*M3\DGӰ*bVpMbGr$؉(u-vTH*zqf։"y4z/Cnh1MݡEu3oRZt2;NVo߼, G'9[,R@Ȋף& օn>UswQz-=ccQ(g L1M..Z B cQ*Pf&@HGH&DܝAKs^,z;&H 8#ȫ)]ˆЮqo aC 5480۟t/[%;(VcvҲo'F_/jޯΈYVjYvioZ^~ޝ<'Y(n%!iwdRVGO2,)D@߉V ؎nP>\V'2 lh< IDE#P8A>#(4^+^5l6%hʿ}Trj?s#͏M7DJdy|x|tGj"3"S܋fjYsr޲)5F ,TkʱPXղl9y>2cDD[?ocT)M5-WEB2<~_SίZͯG[|VnlsfGxShl=ZYnnT꾒.3yRk} FWi w'&]!W>`;zA6ٴZ#=Jցl:?} !EH$m=rMOϫB(Y,oJmE(LBf1/v..OOZiR#m *7Wr%I&s JRih3=k3́K&fBwgpOTN0t?s^2;:4rUw/~7QDnFlL?|`+9~.Pxe?ɑ5?݈8 ?Pj]PErתJ,<:Zk~UZ>q^Vxa-?6Ȝn[IM8)Vi;&y\q_{;L 6P\|awmQrv{oeyEһ۳noԩ],%qa4h[>0fs2 mp.pmu?ӌ>';iy7|789QwLH*l$͢;JNː{gE-a9I)p ؜ *.x">K\o T}1Wʘa ?w'H0J }m ! f!5Z8>}c%1ݎ:_N1n{+?D<J"J% "ߍyKb9O"fi6Aq@iAؼT820T( Rm--fC[XK[X!T1Q)Gn9,d k޲.1T:~]bJi]S1u !fؕ~fosv7LJG~߶7$&7Ըf9-;pQhT;6eu;#BG #b$ݢ[l"3SGB7Yd\0G7,*t<X<ܞ%GLAZHwNAKN_) 5}8&efŴ_V giipӹ\#{ZЮbuδ*)Pjg犪$)2vh-6NqޖrO5#dØF&$@cKFho҇>|0) C,-5'لwG@Hͅnj8}gzq8-'^{]V8jލ jT"ދmC 5? t=Rr ^yc&G5d˙"ק=n@Wv ]{BDp3rJ NG}yϚ6ԝxǎn)^]њ9/rILy|8t<;,XAk,VcE>K2l[VAzH*4-}!b9j5^JbBǍW%Xotempy4'( p Z9L(*!mSPfz.r+F0@^}\r0Oi C @1X)o(U bT^t8 bN?@8Y=y@J2HaXbXh(AҊ|rWC̓[v6;P3gaԹT5H0ô;dIVZUd + iE1K"ژ[NZ?1 ä*/(9h oZvVfaA4HuUӉVP }!Aq%o{pՐ@$a}nzETf"z @HIc2c xQRH=l){}]+]ҷ?(!\9 k29mĐ icD0Am_ƫd-2v)N0I gGaz}UUV+ԝ(-OV}0k<#($2DkcU"^><<.f=<,$P"aĆ1 & & ^ \D YJUS_z& 3쏳 a_^kNsu.=(Pl6v/ ct97#n,Ǖ2.Z{DbZs?3V 90RN:ߴՆ%E*- @@&uw*AR:U_ݥQpX'cDk҂7gǟe 6E{4}NǍ#CTi 9I/M^P6'2,%T+)&9&HR! 
9\8)904Iҝyhw=WdC\) ͮ ~C bgxƢhPO?@~ęk.E v%S{”aKƆxoc-*TEݍՎ;q**Dnəb~4c%8M%LN!, dd{:@0bb lw(0s@&YWL+5wnB(&FՔD)I0H@vq%)0F.W7;th۴3@'\!0Z 0A4@ ]9zYsy_>no% n$VoNh(WbVwP?ɲ>=ʐ"~=Ma2^{LQ蟙5sgٔI26T "O*#*{ep-ic%L8g8C%:~J* ,,E{ȔvqJ*:py$)'bhnf<Р֞+ኊV7b֩1T& HPQ@Pںg 34kߑ<9y?Wp6SZ_ "9Qe$Mv!R 坜\mYṞSAHB)ra\ (r !zj%6^o4i Erfv^WkZ9;@@VN]v LaTNK %B:<%t{ B뽯R >.j;9ME:g5o91FAo٤C+o>eYw(᭱T{&wqviyy`gp>p&v C = @9rnP9x?V@y^QF埵rv+Sc9%jad.k:hX'JR)*mv.79[;M*ҋ/oʊػmeWXz:}qrM2S)V,9Z2K*,Z,P )TfDAFhtn/{Es\<%Yn1~peg!+5Q5F4QI4 b?jS,MiQ0RHp`$'<NS ]y@vDKܤoJG)b~E%sCѝ E8T˂¿Gzt`%a4s̽XS a1ǷYh|p׾ȅޛ^_]85}x1|fV/S\Jʋ.O `o^.Ƴ bz&s2St jU˗z1__˪EL>.ӱd:3 OT^*N se>SϾ\Vid{ޫEɳcM?էô^j jUGx:`j鿣q\-됐g.2%]Z)&5@`ihk,Q"*Ǟ5τ#C7!jj5rQHq`Cb\P]]c$3J:%YGF%:Bkh}h7 M̓RZUD !< B4"B&8QmbhDؙw.34MEMM;f0-0pR' uC@O<ڧb 0VI g- "ViqP`kP5.تvL -IS1|q”liSN}IytOXq'8̑x8Š8=V׆y8@ Pvֱ:p H#yUGMk}`nHarSKoѷ u1ڗ#)9@;P7^đ} w|D֫d}}OYSpYD#4yHġ\g !8F G;;sкs]~Sk#`TaKT=SmȤERepސT^A0i_JZS7q Ae{_v3F3;Q(p&8)f) Dpk+֭p4Mf 'w 䢻ANЭ|.VN9dAHNGƹͱӯ'cLۆ $,q2;j|F(G9;'a hd5 9?ʰ)18~L @F"!xdN]p j ~|gؑ(| ;fHr--UH"g1$S*q"A̙1q)JS(P<yXǔՋ^}3,45ioBgńDY^S;/ ͑(ڟM"QR2iuNXj.ZX=.C?!/FW/5WwJb!@% 0^ &b6oa ؿFϋI/8|4MK~] z8%POe+/߸pw @M8vta$  ~}0XcomRkz.RH&$Q&cM+6AόpII"籌Q*hۑWokAś赁qۮ~!ekŠw&8Dq4R*$8B2J#Hj-aH!*iYİ _9)k 53DwHF-EA(k;TxY`x@nO31(d`b`B@x5#fwb C8 W",kHe4f1O| 0DH$ MeJ 5BHT *i|<˃w>QޱgWjv68?\[ g.O:,-=#̖ԔʅJ YӯL.-V-I{(k6if%P1MqzNf/R=T۷.߫]Ѻ^;Ggg/Pte-\2ԝTGZ҈9bP gDdzQĒQ #2@udѺ^Dbv*{&*iht5W{mclh颎0!G.`w7z7͠ƍâv(\P^BzmĿip($P~lՈ7+W52tv\`y>.i)c=CE@S8z$bO1rkxwQ-aQF>,H6VҴu53$d-ݜu.eVa`gU4mI,5'5hR^%0V6AW/΂ףzvgf0*cP  纥ZU Uu8!4Hꢝ8V0+0(0o:ʝmo:ۇKԌ(d'sICAҫf woK:`i~⿉T_T]]9jduE\wuE[9*%y˅Ǩ1;nP +GV;kѰ'YuCVa*rvu_o[8?>4Qy=Xd]'q<6&UexQ A4«ɛ&b`*XF|(Fn0/Oj?sr:"טrTrdLb_Od뵸2M?̾AϺ|[2Q`7jfAY$!\D=d [1jBgNV!FR߯}u{\w]ʷw%qAiBT"S K疇T$kʭ17r4*RHYϓj+e6o#m~}y͇F4HpKW@~) 2*;GG=N'{NѢLt *Q-t eHN6ڇk)pRJ{3pmɈW48[Gq0\3!9'טhH=k|+"Pj%DcR#(}cߪ1PD$g1FISm~N}+`n fyRWL@#dNUNݭR3iwjDft'1uC`U0xC 횊䫍d N+r;I (Df }8Q "jZx d4Od/c)@Mmg0:zG0&mh16 5Rwrz^:{AWdmkj怷>+RSyGFSߢoCz|kԘNɕR0A.B 01q/hY!.yTHb@;^I/iy1qpGOI] =2_O1ܢ<܋\rO?F;GSvu'#(a|44~BHĈI}< jFXpt#o 
vڔqBs)|@D V<#u[ "}"jTic|6<4< \Xϟ6m"v3ۄ`<#~0`N@<Q?\v[mQܒeq у'Ww͍Hjo2_ʟ&ۤv˗ܺu+K^Ivff+,So(GT[эQm˺~`QС@ 㩀3??> t 8k˄ %L7 %Itb@.)FN.3 XMrX@Oe R"D[|, 5zGĔBv* vB3^9$ۗUDb-8+cg<`:{/#|pj^#>˦Cpl*Bx9ސ Y0EYY߯{:, ߡA `ym5`._FSYfe̒4C!j)e3F#f<̤hHdc;O^лBݽw_$~_ܻMQU+"`%#*H9` YermA FSY:BjK2u$ʙ<:[T"z]V4^7 SXnZB̀jB[I%Xb #iEt$1i,NcA`Z&#X,Jנ!htHbBҵ͙ʺ!F#Q3(<#HYCj[-2Rk(ZgF^GŚVVګ*挬fPjPGOf>Z6ss~RO0ſۉ\E3 <0/^4?i7W;F$y?4i<4 >Iwлo;c3_|3;ӥB4󾃻 Xqb CE'VE\!+5nS /.M:63*/L'~Z>p>FRғOm\:E; # 4*6QF^ #{kE7 {Vtkn#|1|c"DSK܉&6Z^KRS0g-G ;.^{XZL&{ ;sR0CB'?'|U6NCٹqghR;Fx\(y@$ h\>?{x\|N`خʴu V r^)Neyؿv{8S9A$ƣ|EfBzUidG,(n^mǃڑKj^EsIq]@jZ"q\Xn;\FB󆓦 DB$ CtBGA0k2okJk˙]#U~f,4gQ tYsEKDZfp$:Qi*%X.Zp5q"ES~[ojoVzgJSҜi"9de:cnΈFp|Eq|=ﱍoX 4k5D(f~*+ģҙ\j CQP©f@c bi*tnQV4WmʪOI%;3}Zg>9oi\!мAav ) pKC,TK o8!h"UC,KK5nq›  3&TjQs3ԑE`]I/B~_iw0xs&Oq^J j$wKmX(ㆢJP8Me,l 1Э+띻1?OzqANc}J[Sj}Ԯ:trvKՊ =T9Bu,587(L&mW~?ݍ&H|A?1'x2 - PwS @DQ6HFֽ8=9ZkY)wÀQ[UM~sww-8WZiQ8‡_\2W9\l:LYHWF*h%tԪr$vVWw  ~/ju+_.)GN/6O{ QCH fﬧ_YzQB)H*kzh^KU=Z oc Y>l"Ubw<+߻8ʉ|6},ENa)Wj%>B8mޏ!S9>3rpk4^zwR/b({z_MݼG Un3oeZDϯl6f]xb7 { cgi=-*P Y7n۔VxO1< Y8C)qT6穒Nޛ3*O B*- ߬Է~Au{gUOkkI eB¼?6B*|zVK ZcLUOC~ 7U@;|}KBF!X< uSpiic%HP \Uf [byN&Z+9ۃ!Hv_%zzȀaKuEqa05ZV ާLgqdXee">POZ^3LevxްU V)x4_Gȋrk`7ޙ`q [6rۛ*$_/=<.>'h] V!pwlG$u^Յ%KFC䑵H ^+t]S&ݱ}4>˧;qw7bB m/uF'T.k5n޼]iѵ)ֳAY>OpCO01U\= rɹ{0Exs45 1!_1h[(!,D4+nG[o멁x[U 8{Q9崿@ަ*(W闲[wܪr\Yp" *;! 
"GYS?SJh* 7#I_>䁁ݵ1< "O6nJ2Y$n1.8#"Fl[jņƝS6(kժ6s b.N0ƪnL (yїnmZ0򏰾:3.܆l:tv*|!V bfW ,sVڀ(ӼÃ-ܱT[ء%v:j&,'&FzFl׏n%vt @s}q Ɲ9E0aar[P$VSYl-%̜r_8e8"D9pio5Z02Cc`)``LXpt S)Lr%ZrͫN1Y܂+C)x`c7h$u vo"o>u_ : \G fLa2D.IĘ ԃo\A "N6 {?~ÄAɽ3(VHˈT H* C& !ETDZ(ֈPEÑ{lj&:!kR`9F#A&^#g}Ya:w1~+u2+ITäLHuW?+Jw;)HafU.m9 "҅ Z1LLp}J]?haGmf軇tP]8ʮ( {oLG;k3+1с+GW6xVLuj.ЈwsSU$ЊEB*H,)87s83,PlOt%kc|JTO3)h=#Ԉ Tِ<=Vgً0bcLg٣ϺHWQ6JDq2pIgՃD-sM$h/y9In2sSdٺq3rјLy܏n'rF+Bx>} !,17;k*5=9|_W5 Exkग़'wiNld=|.=]i-bw}|ZOFO[hcI4?8x$;'#ffI/\һx|Hs-|T'>q|spy8_y ,1 (Zg>f `8G13{V5r4BUK`%QACʅ3JZ<D:4^ sf𛧷1SUm̔pn !c^fbLfi˱:p޹;0ۺPͪ)7F@{;#T1պm 8}2,Һ-Zrn2UƸI%rH%/I%ML?h5sm#9(ˡoߡ:>b`+IMYugH`z߬ukwH1jL}<[Gɘ@)FjHT6|ixu>:Qbᙹ­[ɒR'٣_d'(ѻpFiT8\8sRN *̧Xm4BjŽ 6Uzk8h™Za9^?S3_`J۵kԔٟ^,<֟ dQN`Ldh^K h^_b”HypGQbр+L4/Ukw)lYpŅ{1Rj+R CM wwc*e _lUU眑; (0 3!-AZP!rlbe n ؒiWʛ`K@D71(qR-q"cƒAP܏,H1)CFRZI('kyu{zJZ6Q7G@sF,oAJam*J/ ,v].ALh#̐fHɬu Y&CɷP1Tal\ZbTZvĪ[wè@OĠQ>bC\"BE/BAhu N$d,p\C!T "Ǔ0,MN[c. 2c9(Nx@5R7Fz CDiTT82pTtq-`q]ڄO/m6$}HnD虠0P6r%WA%b)Ic:1 -&rHT* [BUHH4-HT|W:c&܋wmAnƇ ju&'>F f |KljyZb.w:ς[D7+,W.r|vs>E`W.Ḿ:t~55x>Q}}-!ޭC} z6PJnr_?y- juM"[5T3G5C/t<鱶D O J;DQ "ə3$7:zl$2PMW |Wg"FKEB`c$#V07bSU 9ux_". 
7)X  YiFE9T 5@ZeF.09("S/iD.u)iBE`6RC笶f)J,rMiօb|݂kug];2)T AFlHa8(8 SSN-Mű %ĉFQ$ɔ'F3srjUOWo[PYn:6 ^ +t!|n#;ё)!2O{>[zS%Vx 6 pV=UY֙HQ*LpEe9 >QՁZWAʴ,%h\thȒ*B60B]N\Yf9l(raċL5<|Q{b0|p71k)5 `T#EP4J/)58M8=[zS)xc1̸i/~@b*ԝ!p @2%fO~Rձ7*ATMcsmoTx1ٵp"p7Y'8Ͷ7f} δ鍣Ł}|55jV)3)Ni C%ªz@ k ]Sa-׿h9bAG)sĔyOP1C.S&:G%SR`3lxJ:D?J5+0~EtqoᏱ>L[K[0vjVzڮkNSqN[y{wM= Y6}ceyc< _N|WkfsٸuM'yم 5U)[}</̊%^ț;{!a#xX,'SrE<ϐ7bB Pb_1pTXTF =vWx3ע6,FYϫ(*k)U9}rޛ-Ֆr6Lx~+Q/@ٽ?|Ü@0fFԫ6vR\i '-:aͽ.el$Ekg8J6U2ݩ?OLjLwl3T !Fe'KZ2kH =;^CF?Bx鑇ed-|2_$qjK#.T瑴iGi~$O%zc!ɎA=i %?>~Y&_ O3DSu QVw p$+90uyOvOPٹ.V>jFɖTVxڱD/ewʚQ=6ЂԱtށU~j_g;i)9PٲcUpkAhё.m)&˲_4Vw>("]BRX8/?XP|xmg/P8B  _bI[20-9ˀR-S 8p^wm/} 2=v7kVN1glNn5V6:_%;xz>fjo3qy 鷫ώSSƄQ%߱h*sTb8!cc{dP+SN$1XVV %S:`W-m4vr^j g 2ln]j [fY` X&F hs}F(Gl)W!sDZiܳه+ٰ%lUSc]70l=CuyX;Q$i:GW~LW_ݻ햋]yRE_\u<u4%.8 B(ߟ:"Bx%)wci\uC.[_Hۢ5ƉF5U1 5 ]غ@"VSN<<+Ul8rrS)iX8CNuj$AXfVFP9ۑ*u{޸uMq!RȖItd_\yKЖEvчsk쫞`ƀe¥,IL4}$|+!UZ皇rݽsOҐ !TȧoEnN?A.nD~еDĭ$', a rnlxM{{u5VjzJƺ-Ra"q;䎐 &CLmR[7DIҵAsFweYWpb_ԓ]n3c1vAUdUuOIIMdnA2`TsnyB0ňޑFN)[ӳ#'6&UɎ6$E&Rqr#N"ekXQ^6r2 &ѱw,.[Jv73t'y{NFqq2PO  oK|8؆7##gИ޴@0mL)WɳYPDZw#|~zʠĽ }v8A.8RFEkSpԶ갮O`I&@ %.^'p8k87˻aj8ohXGvNq"g  BbHjވ譄]9J)Bc^@dN:)C)Tkd@;*tFJv$;2:j[ I@jD2ed 71; \Bnkau N +XLI9<ա!Uc.١ e$ 4г .zVjC㖛b@XMtzIRܻ} /!:Iܹ1 c5ߎƏE[~{/,j܋T`XѿQa: 5C+?ޅo'Ņ,uԽq&|6_a#ue_/U3a6c3;[~S}s 8 <*=jŒQÝp XJh(QVFdVSQ9õZ &aGF EfT\5x>*ˏMA(43ދ`%/XOބĭ^ɸxz󦶮9~?[ je[ ul -ngGڽ)9ԧ/mDpVJړ[tmU}L-'Ŗ?}JbQppwA_%A\or֙"I$w3Eʏ7xs9CINڈIn}%2=ڴ)n%~d~y#ҝMV[ yb)X$]Kn'+%1v:YU۩B%3}^ڗ˾4݃X#A*_dw45)2Y!)NK nFC®U C[?U6nqcȾv`KU(CXvIZ:U!ԃ! -# +JQ]V=hOB쫩iѸxs#I}EC)R:+؎R ! 
b9G )72T QYBUۿA@8QBKh 23(+/""˃kQTK-k[R`Ԅ6Y]fVJT0@a0H[-zs?=t,n ?_WfBojH_Pm$8iRQVhlEi?qzGoR KM}QPC,uHCCsX\[Ⱦ`z~=`;yLH_T.<}!D<_,s6sUaüNsʧvQe}ٯZ_/փ~}tux)ӟOyd؝78w yɠ t<]OĘdst0 uQeto9-\k{tH85!g|AaF{1d̆9a Rz/; U]aiUahc)5u&Ҝ inU@x!%E,wr{_R2T *5tP&vZjL[Uxi=}wV`c-n$v?Z-+ |iV2}J:HM~IS[7KP(j}Z[7'Qi::;LgbGl;t^KTKJq!(`y:Y& 1l aJYZUo?EHdSN"dԁ;ㅈ;<ѳ6{3 + ^@kp,l,۰^X2r)Du3䬔G%Ǔ+K9`m- R]ať(ĘRa L%~9E"Ks\ZDyFzч9bLYA8K DYr];q&4d g4¬Cj.2I 3T+Eόֻcy*l1LEGJMŞ%j ;IJ ,?|,sє@~F9Lٖz !ER8YPxz5,RGՊ@?61a!]U !VUpQuOiou?TID\BE;˔%x|ѩ(q0y;.;iDE`ۋ7DZ~lMJlҤK JJMJ7@BhWJBHԅp,l ڡF%+)T hGfK;a&CQ\KZDBa+ тr\$,$G1>g2k$#\HuZ~ &.)Y)q`OqMa%mOiIBыYiP;ț$8Ӛ3e;pZd@͆^B}9/ʻʁ1Ĺ01"|~QѣG]=jy0"<XXY1h%ҺTC@a-@#*q~+hmڬJmjHYe%٭[~3?m1R'+n;WKXv=Jk2\|T\\q2OQ>ƛ\]5X"Z;`p``Zk)w2Y)\-͠CdD{FIC:C(\(db5\MYIZ`<c/xeLxzT)V`RhMeA'2x>;Z5F*1`Bn0Jc`„tQD._JY{ jA t&z"j?:I`mԆoox"h fD$%)<6]HT&g#nSnPZ4])P i/A/↦GFΐc؞s?uIho}X3?iۮ]tԤwn3J,ƻ?|I_\9h*?Q+Ҝe#Vf Î&]c-V1ݯ&L6骩Up N?뛊DT%[1XR ;WCR +mTIG Hĸ$:|VAB @HcQ,=x-TXpDQBV hh55Ahz_T.|>?fr24];%tuRd?H7ci(ٸE Ƃ M=x# =N6tw]*> <߬y9xG^Fi4w p>2⻟g|b:߅1!g`kL^^h?u>C$(~vo9|1Zc61]\%mx0nonojW7!p9(jhXNg,n =+9i~d $%QHV:\P鰐S7 )YS;~YfqKѓa$-lN Oa0C]A^_?M?}ђƑDafCHWxD0M'Dwe{`P]ceJ)ZH"PV…l+p* Ŷ`;F`/-cCEi `Kw+t麞ǿ8R=~} V4P6$1kjL -(94|f$*L54(P@/AA4ƅSexYhc#\cX^h4PE):!b!Y:9 N8Q4j!y)e;2"0heSS /utD:)=hֹ߁4L;Q'$KYyw5\O >)Wrт#uI agakFiGs"#+y"µJQ y⹃QZHS t*)+Z-H=^w BUuSY5"RP|?A}0ādEW.= ֛h|DZq 3]ݿ)秼}cu9{y3?(<4&03jvsK%0(O1&\>HϢ~%1G`:X$\0F]210ՔqDKF0!faz\*8g-[W A8D!*ѫ襄?:) f{OV1kh m^A Ai% uE =$z6/sfqs? 
D;%|T+htcKZ P@TZk _B<83 FU-Vmleum|N{Dž2GKQlyb<+N@I6 ƔaPIk·k">G3ٰ\.l<8|uhE(®.,9{Zio:AA9鍅#k C 0 K\ t˜ ȅoXHJD]6R9Ƙ.䷏q#Jt0혃VjgJ6܇Ul%Jbdg&!qM|<'[_xEtH$@ϯ{BhpNu ֥k:] IU?qiEZLPz)\n3L#Âh&<Y qjGB3ޘ,]Eo"mNn}{y|ŸMW8)b?CB>"|r/U-k-w~Hz\i7A<.Mw0>FDTi$NZ&L4z5&t^tE HI..딞^ẗ(!,IaY`J)A`7ֲgF:PbU+5fW^BDl`o2 #[^Ȧn2D2*d&; t7/U ҩwh~ /=h"Ǵi!$PʔQV"ZA6rzAOjA}h渦aM2WX^ +ac?)N*3" "8ix,eF Fkݘay B8n 2sFIѐ9(R^3>;ƞ Uo>6yG"Rl   JW6$ X9O@/`pMX8a0pY@bJvķ~_y~( [s_ +\RţN&7jܽ3nnGm~0hOqf};9~78ߺC /c }('VCνOjxuoXM5.s3g3Y2᭎ gA4LYu,6Mw+ ~v>ݸ3SmI3iwQ^21m5ڞ֦/ Psߊ/>m(qJB7ќ7l7 _ SLIlӌegmeÏnQ#y5 ~*&-ԇo߽9>k`w]~Wo]\]}Ӫ:cPk$ft{h$WW.̄iǃD{u&L0<)GaaGoI&GMoRqvtEmyEM<]K8OU=hL`o3l5n{Q.4 wMBHH 6hKI_bR!_0tz鼛 ao/-1{>C[`pqP?{߁Aɛ{U|4oQ :dYNa3K~k4YVFV1S![t=5lh[+QZZZ֪d *|4}yRc) 'Kz}r }pGGh.8%HL;}9-P+3 \@|DZw`wyMf\Œ̥%Å~ʉ2M}:@.k ¹@=RX&4O?ZG)2ӥnHBotAh35۴ژxy 4X i,USaG\J ҥWNGv;?;V+ju<1[3']uHHa57vOѳKWeZ BݙcMyrXZQKO84ߋ#6ǟ$ gӶ~2phORx=Q_G~un^K9SH0Im0gęGXe0FXKb88)E`D89"Rv=7iNA5t~3nk7 D1'%yJ혓rpɎhJkg5`øB ~P,)ńh4Y&~W"^afcdGm7}j6;Iғ^]6)ef =@y6209נ9wqYU|gC!}}Q9D'{5Z$'3p Goobm,n@ޏG~064I\l_nw 4˼Ph& }v41BӇ#WeT|a6ƿ87g<sT3Q7GwaBt3t3sʎ9F׽ޭ0t@N%/BDKTig!0Xy4.4h>dBƮ!%8*b^Kmq)s¾iE}JΗ%\C[=-p9r`x$A_<-º/º/¤TUό72K){H 7V<__${^k+>D9XS # 7޿vRl֙ΫMPR#~' crQZqY`Wz_N<Hdu!ʪQBP{g)2DJ Cr[yu4R4$LPR*!aٛX*FpF$R (;9RT&Vq=qD9"job丟 mZ6YW!wlz ȓOydEP}#gSB IeruLwC)pŽrhۉϱƯU+;q*,C"R ^w2JSHKÃkwz7[vy놜*IdЖdY©V pF2 [ѠPhp% xwy L/-X[$]X2;M5gK z-=5AŰ& FKgUٗyxp6O"4uNkQzu_<\htZ0l 0S A,z4{ %^?!#d8yBHWOHwB|Gv>ϛ4\Ͼ~n$N^nz_??"sZ4,{݄0ϖ RP׮{ӈ%C6gT&_7b&) t?m_+y-nXgzc>enSg4}MϣN fP9X/M`'4 mU a::œ4mB@7NַW6JSbG" '#ďUBƅߜgR%2f*4 bq*PH"APjQJ3ďpt1BXk0()$T3Ip"BX`E RiΤUY&w/NǐDjaa%HaBuJRPi_B2F1Pmc4 "Wlh&ə>j/Hk" 32 )E'LDdT)$Px2dOc*%4,C)H`ɈET~Bc)$1?dMbMeSQ\Fɸ|lF˖wPo?$ d/YMJmOsئF 3Z~ Ƈo׽w?r$"6SjP,6G;+A m t &iAm-(-O}f.טs\ffg$U!V*H:P_Ӷh 9]6x)"wf5sk!|QGQO#L7WAk5ì:lZ"7_s3)v'l%A=jT /ATfYQ GPGfҤD0$HQXc`\5ifaf_Zٍ0,m(7b]s2֋l2ޖILėNXz뢷64Bb-7Ԯi8QDdF f0N;Vŵbd>6;NeTpeP% 3 ET23Y|Bm*Ah^0++}juo_k#dA**\pmA^n=.X]ظq..a/ܚ\Z QGD/zqDxK'f0ͷic3 59>%C:!uj^R.=[*nFw[:hl)[Efoc6l(-` V3ƥ4vH̓'t-b$Iݍ;:Fy4&&BƤ"Hـ„8wB;q9^QEcryWLkܘu &њ}?  
ͫ~HU 'A#[N?~;ߊ~A7 $qSDB Nd4U 1œA YD3Z2Iw%IRf8_%cw~9^zE k~N0gY+28 )Nb=ؒ V93jyXV^n l/dta6{ӟ>Z|h\QΌ͏7k}(lQb'Cm_f4g9H&7ƛBR$kz?ؑ`Gő^ ZÞD ղ%իLC>>](R.,^0 t0DZZfKڍe7[|[r A{UNЖ\EhIuB J@e|Jp뛋=._y\'U.u!:0sb+좽GvXY9AjEj'#OnF`N3Q`'q]-8 GFПWap9࣯hبw.,0KmGCw-֑n#޽T1I-gQ4MS95>Q = ڱ]h"`ڽ 4\"s"uph*p^upZ.F wv%Yp48%24wVB  U]&kupv QbhuY;!4 `ɹm^HIέ@Y\?A(FŨj^H3|AN۽Am{A UO^357kaRb+AO=CH"DUJV6;BѩF9Vs{vѳi7{d1SMevLƌG=T##+aӗEMoY&O< 3e&cw[Ľj -6E?hyн8!1&q3{]*& ԣњ (q*_AD%Jk4@Ĵߝ]ޏMk0ףt5m$:e첬9[2cpHJ[[#Ǭ|]GCᣘp\&~ *0b|= , eKVC%tFG[EֽUGkw.ZAâzTXi@t E1VE0%%,HXv~n=1IJ޴aHisiy1!-1 tkY8:"| "TvoQp^1amh%%|@>ɧz2\:)$GE@.ԙqFeC8;B*jKx[-̳xRE7 =w ҳAՖM+WK24DDb $B\Ng!=0޼~pD=~~vN,ye GTj'^J]-&hJX]+#TsÍ͝WM|]O dOX@nf1\!d(J5az!άX-ϟ9RV{O_i(@b<"%̠&V2N ./%B’q5jG:uNyu&| mA6t">~`oC綠;;ޜ,R84)O\ϞlhH\P&+PBº]\mL /Iʰ .Lwiy m2 PB*&6gA1AWzvLdvO5f5=*oǣp3N}h$Qrk029:_ ޶ /fCb=&f/Z3$Xg.!R;,d\PAt]1pi<=j,meI\¿t㻆Cck>Of,_ǿ͌> i]/4NOFsVZcd/i][,6r O_/Ks/s i5u@\t*"w=ͦAZeW_~IDiiD" ai2I:զMm+o|ro7 _%L$1)M0fY&,TL ! IhE A`Fly^seCbRm-{T]PwEŵ&Kd@X%1N.XRj8@@bHƉ$QVJ =WK{ ,]ma!k&R k"،1$=M 1fzS`?=ӯMX`h9|V,c S)4SQdT4cRSn,5AM3akg +8՚g8(p'YjF"DOzyv9VC,id JQ"@,4vGc cτƔTB Jna5n+(' AA8.\E#  ^qϣik c&(1/ lXFe!ͬ%[}ҍ'R^GL W8BbS->E=[mj DBLYj8b_f?B-oKr':x vꛙ)ǚ"@Yw&As;T ZO{K/ =iJ1 D$ӷpZ(Mlb`7=|-=>V鲓ֽtb|x|]ǟ6_zdRbw>dKQ)gg F<4}GhA{O_{`ׁ}`Xr)A`YvoA~  xyHa,ap'!J"ןyX;ƅ }m$I^v{g Օu8BcbǎvfQ̱DiDmM:@@%t;| 2++ʴ3]ٛûCuƛ P 9(/͕X/np i5$:!SLh3+x^J},_(UsQ9[k],RmJqebeND2;WxQW@ I9D]&8S+9x1co_Jf3= :dO1rYr]OBOʜ<}5;w_Uh8V?kqlҽk[m?h%x}'eiVk6&jP-HB%OP'|)^B!u\-s{S91wAM;̴i`=,br:oFJ,P#`~nlmSR`gS6^\^SPDne@=iml-KT>cQuB0/&֧(5b"^cᢛ9WTYlmF2aB)EyO&@HixktU>$e6_Ch3 1RDrC|ۀ+cנS]򼾹M{'I@)˄ i#yf"c{ǣYgy p-sէx-KaKOfD8fY @z(jvW)06pMRyF;h1=C1^QuG˳Vl}{8pnd.SZ3vQt{nًLoEͽR'S̜!#w} 띤,KR$4;c"(sm"ZiOTyQ?7 Az}Nub} 3*oJ:"){ *1o4p6<䍑z291|Eja  #F5| ${?;} GHAޅԄ^Ԯ  Ie|µЦ/㟩 bՆX OInT <m8~2Sc8/Ye|\lm~+;2,˄IXF)ܧlĤSr:XkaXK?|]51H-3DI9{$=DZ^ h _~sdެE@_pip u).wǣN&XMxbG-n8fo\D@\lBe;gj26F9j$ĄD@ 'v: _ƳNtx!jT-0P疶FT=mB^qix kd#: n 5}pr.'e[J׹&~>^kWwUpL}هq&JaT/4vthf('1\ՀwV (PRhëQ';ˣ\v;뛣WG~x}8Z9)s.x5pQY MB2EkU: 62vG!E>1?ݓ- x?8hr"=+ΈT 
Z<#sy~}-n wpsJ?|icϔ3 Oef9,hpZո ON;}I+뼼Ҍ6͉ͻ y SfZ i"ek,WxΔQtt: wc #- VY εi>&B X(1FA$x TT3GsS42Z+c8^C:RT6ާf)zSړZ1gQe'IJ<.#6&%MQjw"t6N쎭Ԧr:*fS')JeBy$Ӹ3L&$0\2*.]$wQ V`5XwX u{ 2%nj8*dI8m.wy9weRA9t&,%BdPmLyV&w`@H'&(AـG53@c[m"0EiE8$w\ =)= Qh퓏ƺ!0HMd9ST㼧>%AO)N=ͼ/cfH dZ l$:@l(H5.^1rP yi <}ҐBA}Fq㛫Ov'5F&%a| E:W*(x?)S߲o>?*\0u"S/Na2#,{(գ?oG4n|<9;=Oi65"W;v:o$~w.of2sW6gpc- ?ngqMa3w߂8 XsmSY#ebҸ- mfn<X/>YxD!t)E aM^+p1O\rE,Ѹ5x!p_0Xx9xZ2x&Vu;FgRԁ&PTSKkq 4޺ReND2;WQ@ IXNiG˟-AY%e&s!&FEM40*4E'2S( Q,QiJ^Y 8-&{C[P$C3VsǢ&p@ de y2id0bФX _xVt4ĚϪ7ozla*diXdBב4//oTAhJ2;H]ƷiХ_|4S0Wy,]?ADP^򹃣 &rӜOba&.?~N@"^H"OZ߹7/cLaT؝_ՋPь\:Sh N:jsi&%I%U{LҦe qQb{#Ys5H=\Hѱ*TTZ}.378ggK`֬y3JR6iSM^I05 8bXSk֦; &V%>lYO# Z)M0 \Q4XJ[ JTBW>XH;qB8a%_q.D+aؙ!.!Lv|UxSe>X@5!u~6(N4JOuJ h7Z򤂥̥*_fd%d\!Xp٦ve]`DDiqjdP]MEL|o3T>&H*44oG!o&~wڵ^?%:OLb(]_^.]neU܎JEY:E4w^!|:˘O{V!XIh'2Pk3Bh-ewY>ʢ!,ClY~`5\\_D\E_S$3^oن[8QgJ!*Ӝ,M,$#+Öt^6D;/ґ bq.O !Rx2ye3'J.CzjKey3(챠ˉ3Q>"W3kXD<]f)YiOr]hu,KsPV [b`o'^*_:lk\Y`W g-eKoi p}+pMC ڔJ5>6w)QАs9(I~93G~ˏo?}S\Y7>ϾϧOQFrZ/6_#C/,+`O 2'DgXOXB&m<_T館OV 0legGSIq\:{>G-EiD<-ř,-+Tʄ(*o61S{vѬy2RN">WRs7bYf}h4C .k^ʤQLeȴ<3M/T:'@(74,!ޣBztJ3d$@IwRhw:d6_=$'8Cix *l[7gIWm4[w-7kØʌ 0a~-g^\e3l][F+HK !IoYlړmji$;/PTy$ C}-W>jr M~&JD),o>tS+Fc*8R t23[a3Iѥ03ΔMa90~cc|FdQU6|~e*OqKYkzg’f 44?Bۄ0%iy0<1 FPt~b ^ktK F(ѱ F(,R@T: ] δ٭d|9%AByeIĘڒ]+i4!v˅F9sY(Sp(`GU(** ]0΀*Ln FlRÂC,֗њyY)i*Y˘ Y!\T\*e-!YLZkp((xUP`kđDSI^h]a%A_Og;IU0%63AG^m ٢$*MY, kv)rbﰁ$s ?Vt:ŢF4i-GsUSϤRN.,DepHJ/CƥZ$#wGD.!>\4V+oḌf3E}:nHz.d'7[D,3;A`;aDnm j A |F*= SOkbU%\pµ ;БԹLE, >$Y9,B}VV^ؘiPF\k S ++t'C8WX n聑C!7͆~z@l@sMh}6[,ٰV`1ȤIJwckF+PcA~$9&dJ1Ӂ*{Wa5AgMtz|O*}<WW$6]Ɋ+2'Mw3҆! 
ck6KW|$rv^7&bmb73Vao GO1CBy&I67% {T6Iۛi%OL~oݾUsEyN]Kp@nBњ81D8gwuNg?/z,W{o<-y,[qpF-=O=ٓcoʯg`38J^rJnV^6\:8#h%yY(87zL,|>lάCqWУW!DWgW˜m~^6){WLj ^3G{GuGWqY'd#X'4 !cy?|7i\Njf`-;g1_z<'i)-PЦ>X- VI>9 Tku]AoHX6KΝ:܁jDPY[zůtOڑ/X/ tOII%]Hh0vٛ~SѲ%~{J& D{T6TM*^ 8=wIǎ3#Qy]1r?,6`,^'뗷?`dd Y߾yMo7hR5Q( }SuU~h}z={0I!6ˈB h_~wpsv}{qs bUOYiCFDRXgwxyel؞g6A5F~dW4o/+.djkpprf+ffURcެCSKuMYyf` VǙ ϵu`퓃5B)'"]k6g|%ߗTNdm!3'cdYvI>9ҫ ج9//8 pf8OQ 7ކAna UiYҽ*p 0 pΘ4`7p1);Vm}A[!gKȏUq#2P#鏟m4JL-QD#\?H1C.@x«1*uM>HEm˧{}C1g~:g|*f>ϪZ JUQ:8jQX9\[v֎]WEc`"E9^ky=Z>iT1$EW$Q1CMllf Dz#ep+6\뫙)laKU~PmޮӧP9G{Xx*0 T9+QUk sdF,NеYіMfh-'If]60V G�ó3C{&VEDHW9ꦡɫt/ShX`ՆXřM.|J2Bb.pf2C )Շ%wz8Պ]NF'Qa#!J{;J0TRImwR)5ܞJ)7筒n92汅9볧kVw忿ï?%H '8{MV?;p ݲ7`zy%Dž܅9ײ=uȅzSWA6/a"ȻB+4QVT;Bh 2zƔb3@)bO0B EVV9z[`3rƎZ bQ8V`r|L ǗOWo6),G f&1eW9<$l=5> Z xeCaͻO8: ,4IqsCQ(D* ZYgxow{c) 8Uie)v"fv9~蔻S`'(f@dBoOs`l_OLRk#sR> mK8`[9x1)u`0fvnq(|Y[!O37'opfmzeyr?RG&U/COv(bϣhO.(.Y[vW4ǟ~;5kU$3R#0TRkc2B+-8j Npe%j]!lmT "_uo.ZBAz&+zF;(m5}r>9M G 7 gTw![T&3şᶨ>.UWջv:de}ΖO)jnu:)dB)<ܼZ?}cp6$.p?\O][뼰 ʝ_8~W>VP(Uh8"hQs6(dhe~G*~ wĸo^EnI^?<ߓx.Cfܲqtt¤e8j'+8o%ȱwC$==PAAYT(BP8G1JjgHNդ~|u{HW]}99?̈́H2aDN&hĠ('_ 7?Lw_}ӯkWM{y7|֚T@ T@R f̒ D9U bC%S81(KxN]nfSڠgTF?Zpތ&Pa~=zHC e.z˩rhx}FsHS)pOq~Bel乳0b8zkA#1=ea x"qc,Xdu ߒZ/_C]>t{m4X)\CCrԦd=,e| N8PbWKmq8{LYo&edq6_]-&b/6#|xlyJH&E! (E|(*@1I&V4NJ)ƅ_Z|䀃1HI2eW7  Q1ZY|& f[80۵``:ZRႿ:A3rM/hfՒ\FjZ[\.eUD$r4tZܯs@vmMu`%dz~*ɤ"+\%WpS;.pSi3tsK'[VP>õ;Ѡ9S1_Pqd+lÄ|ZF驃t EOZx C[2ˤ*(7cȐ+B{7]j盁kjA.akLJT.ٙz ~|?w>+%^!tKtWey5yT>^=`aZ{"}ܻf&*$ }BZ7 kYB[z%]%?Laݭ+vi>W Xc8jWv .=/=`=wnn$th,ܱQj.@NAc8 9-Llaԉeify#-p`ժC,&2y!$?^҆p9X\LL3 lq d{{*rNe4|9ewprGaM&ei5!-g}|}2L 7NSF߹_f\pۇ[!] 
7qkj?ǵDZJ3*'J{ d c˻{ɘZƖ3^w+ҥ&5!je94F% i 3ll(nbw;pMCWi\1ߪP⚯Rj3(Qn d!Crw2HP7$2 2Zar>& >nn͠?!+W]NE5ʕh+/Q)%:u 4c}>M.ZeLͅs+k zf61'LtXV&g_~ِٴq¯̦4B#]݀uւ+]v0ֱt-E JVTDe9%D 2cE k*h23{ 6~Ykg!&7ϞYÍ*ݲh \+fϞG fOѝ1& O_?Y* jLQPC:r\yqCzHL(}njn :Mִ _1q= hnks u\yNzص"`VkZ4gkӈ'l $qw Ua8~s3Z]8' )'ŶH*}$|BkJ ;Cz5ΪU#<=(xW=di9']p9dL9t&/UYkeg7$&nF*a-xwd%:EpЭhcy*}-ݍy]..1E5DzSiL:%ƍ*>]F[ )nvF<~޻_MH@ە`^[>їֽ p}7ںUCRH\:fE[:2[q˂fF +M{>miA;-{йi,@S@)QI pb w^zz` v&NKV#S]ƌ62"Av_{w%NeK{>LT hրSފXASV޽LIڻ"=>y~rƴwG~J`|l?y['߮tR=\^܀%ObjoS\' F҈|oӓ&HWT_^y.7zTL?|ӑL_nʹ~9rI X2دkDQ҂(2@@C5W-#"kૠ-d9bhK!:5Qbh$ *P:_Uyp)c.bئ 墽X(CighQ&Թzx}Z*W>ϣ!1= eӋ"v3oGi8HJ=9; Qd׳xHܤǚ.z jxnkI6rTOt8z;j0!מf㹿¯TF];.Hic$U%"&ޫ HτɆw< e(BĻk=]ӟv1DdR3%KG, Ԣ,8SȚ+Iv2}ZI=JP4 wB6߫n1#b|WkSK.-?&JbW}ڽl^PgZ"hc9`1q%s$_3H6{?IUdm%!iܻA[fWdXFLGmvH:b`he<xf/N?oٱ mDhvܹFTp%}$$<[b1o#VcJQ0FmvR 6)s\$^Xe 6'&0Fw<Xɩ˜ ̦:pDE2) چ, M 0q/C Dwe{5Q1C X_ͼ!; I8'eP)nMTTTYN {sGA&XqF (a]!@=ҋX;\, [ p<($ "M84aĚ+֕`F!G4vU1!i)* EIm$ :.te)|arfb P3X&,8i+KVZ&t^,XѺJyC $ 6gꅞ9:eSp/Nb_7Gt<'{ /w  ?,NO?x[qPU:=|~xтN<x;>,o-\Og>Og]@|H]I6_eS-Rz>L]ߕOlr8D J,a_ﺗ JMw)Ur.v/-ez2=$v| 3INw|r)̗B3֔2NsA-(SOތ?E82 ŴU ~4 Ed_+]<&&\bf8|/A#byPq4eυO?<]OQ;Rn Ql|ٞQmk5Ǘ'Ď9#TvRy\@ /(6Y>Zۇ'(2dqټo^S*HɔV\Pfܥ)#޺yÚ|Yd4 XoccJJ=Cܶl ޢZgC>9Q@m:WOm@`)N}ffR1Yzy-1"_i]P-/89G F!\1ACnXw ȏ{S@!"m! 
?@lٖZCYX&e",3T$7b 5 6݉ӹnP}z^CF) zi:Q"0!=6 5 m`GvPy+.;Ź Ww׫ǞrA?dk%:p՚:|ҕ{X\S =5sU20p:q Cmޕ,[t \O&v>.GA*]sU s^FQUssHtkʣsu =K[Cn/' WJywo2AW}wԇ}n:TlU&Crq>/Fm};3BsLjI^Ngu_| "Sp98r)ZPU6N_T|KAgrϾRTDLiW:zGjXewj?-|9*O4_Gkm,53+brMRl╄Ewݘlێ{[K|sP_+Rz7Mk5 ;GFr(X&+J6whJ?gָ~@l{s!J)Y)aw'[mT/)[q r{ IY V)V/XZ*:4}r8ndn67ܚCdMgx7pc4eM퇌0˺.-#/++ꬬCDZ]Ky:t2[dA+i!gPͲġr@,ɝ I<+K }:+2H/r;M"…@8ެX٢ER&ju'< `M .yAPBBz_  @4n2A p,W(%$hBugE.j ;0fwiS)jM!UL;A42xIuƼ38uʵ6%N&jDyϋ,=ۢow%@:RVpR ԡgTʒ]8Dp ]SՂyO@q.Kꉃ Lhjo7YQ-@GWȟ]5i4$B:gB33ߣ;Irw[w~X?D~ԇڔA}kmng .\R;xu@i JủAn, AINL<3 $XH8K^f\pQ k1򶨂|d#ȣr`T<IMJBKP:GuY~ZTPrю~ ɕ,\8;ږ({y w{ňmK \:E"r6h`ݭ]Z]BÓ-*3V ]D.5c%3z|`¢Gӯ^LJ{>vbj<6fa~:eJ>==m2 N̶U\2k- 0s`FEٳ{;=efvVgZƅb t p0;%H)x&h&3!*;^~қTT 1kkmH˞Qop`{r Nv_N@VVG؋SMP$!{!FlQꯪ뢇.H2yiY5sʒz0>0U( SB2b]q]FϏ]^.KI̜{ c9:Ģ5c|B.29mGFL9t@7Fo]Q>9?BRZnuHB!5_^ %R1!C8FWR$ p&I-br>)gc6|4J\%l"au턛9#x,e:dݽKU˻ψG?^ߎ~!ӷN_r_zd/Ac{4fP8Cޣu~660Cކ8XۑlJ=c{%Ym DМ 7lU!33H3Q)49.$l4Vv/ ";.iV8oˡ%$~rIH _LN!ܹ{7#)?Hk,o*QyS**cz]H?Ll|\LZ~fuƴUc1І "yEb&)ڕWEYіJ(L0ŜiOeRDTVUv z%Ξ핻ptIQFr阣5o'WG'zX1P }0b8 xpv\ΎKqi8;n7m 1o2cdrsHolUR,4p+B@$dCzvځvDD?L筩 K{GcO}u FW¤]yrg h֌1Xev{ ,yKCc85s4 o>^]~Alp0!C!䇈4,B'[=dhnQzW$"걚ؖ!ĶÅVffK?<1>V 8Bk2FgcŒ1w,A nu_3㣿fo$ӝ^ӓ5Ҍ֭v^˭'vέ-k[}86k9sq5q~9lh֜Ͽ]sz`%3nw.fD{*#m bb8_ĩEzhYljz64(8V3)ΠGS=x`Adzw1WKO^/c%o,q!/!ƣ2MVAyy;^sZd6A(kedfY$w*^6B|ApLx6AhyѮVQf 22!s!'Ɯd r7Ԣ;"VkN(Q[sN6 #f*@eɵL.VmiNzv^~Hg-}-Pm]iW1A=kWMBSbДXEo-&h` q엣VTǨB(je^b\UKq)W1nhG> A$f +uVh$1Q$Ѻ瀠*X杣VjZ|{L5?!xM kJ^SBv^ %S, ׁ$mDf t8 4upR5~)cWW7 $=}>. 
]% ;?IiQ}Q~7K10a7?j=0Uy'Ǔ3Rp#\5_IN~ Ƒ XD̎L[.:l,' "22KCp4xŗd3ZdTU0F9!j#M&<4hڒyJrIcV<*oTdDsv1|72MW]qҗ~CL|($WY4h~,aBy4-rE-LR_wy7cWM>ŵ+ z9߄d 7KݤFɐ6RzT"Gz_ґJW$#wdu,ηޚj3Ơ s{"4m*=ʻw$Q*њNVa`up4E9~KK>~M>Cz!l$x<3)1"ť*n%cc9`ǯULK`]DIyo_hhN cͷo:o*Ȓ5g~ϝe ow[YP4YIڝ{>\[$MNh ]O՝a(_Σ_1ee:glvVDuīzFn29yw~FI_**E=+T}𤇹,3̓3L?XqlG V z9(XJ=,&ٞMB >\~jsLH\;M!8عɖ{V&l̢'- ^y;Soۙ%4s#uyU \[ȎNQ8]"Qwu[CTj?>퓡)L|m%a{ϲYPbY7m촜a#{3a vxn22v79dzuš1b` Xv`& +,XA>: j:X%K݆i0_|pޥb& +*& $htm?n6z6P:Oo_xP)&P*rJfھ gU8|6Ӭ[LJY1}7(|v%\ 8(7w)}޶AܚA-qBs=4bl(e!rͲK~7n$GRe3J_"DTJ`)1pBpʃB,D xA2#$wFIOPf-LdEwRg8D$I[nf|ryQ~Lq0JMtd 7t#8 q )$sCNR'5L;~'( 1mЏ^ pQR"EA=$\ﱈټWSj{5WSj{5^jX5+K*IPN5!Hd @AZfFDx*0}VW}Zuha,*bCsu $hL 88i $suqmMև@li*Z%KPq,0ΓKYۄ]|/BCL t.Hw}4=5c2B)QѮj98lI i\ h\ot2f:=ܵ9׌smQR&JkݱFWqȴMrn=~3֦+ojgN i.9}&ի^B d&i][_l۝/ٰо[Cr80NTXYRX3I].lfZ#-_jZe YR}S}[A/D),7|8uF5ZH]vmk/FpڱAJWi`䰱ڹqKhd" uVI*WW1ғ%i㊗*D$XD\-nhQoWB(֭Ȏb"bY6]w8x5R^YUoyE]a oB B=t}z`!ɇ<_ A1 wΞs8pZX_[C[:^0q 0ޱVoWg>xɰyJhd mʝTUvE,@N{ MG1(kpHrWG/JA;!`2O"gA[J|`zԭC]a]$=~U]0ϭ-֟[od,[8Fh#W!ef4T с͚F11cІ؅J Y%jSG"Z]o:Wq5hi`=]uYTTJGbuQ!.J+JdFPh%ɔrG6fܻ^U@=EUf &/)dL =GxEA&EIyOMJΤ〪lk03]DqW631,f資ECV&m;LoyLʴ@GKd{zIXa ^MHei!pG-6T)}kػƋr_?0A&N@ i[7dzMT*)5;8}TɈnMcMx|C6&d)5cfpTZ޿AYvʸölubÇ\K&\qֵG'W qp~=?6dx H~P`St91:ψ379$IB/3%}mmCSb7E$%{1}#IJ*Q:xvC Zɪ/2鴕W_1 Zv2_JɎ9% )t[ů&$$js^\]bAwa4Ad%`z7m٧Fhgjz~0JM\ǚ I`"H4dq-!L[GnEӅn^ߍuV앹y0l,) tƚaE{&$xR~APÚak5ה9]ˑ {$FR᠂vaӊw?x7ݹ6vPz3Cv0/߬H"'TK(] "ĕ(]|gUGwLqK*| \u655!Iӭ}ؐLt[3Xx:oΪ3|AD!/ ܩ]ZeUHSm_oL4p֓7TnH0my=HS=ŕ`X9U *i" K[R!Sp Zi<{]"B]"bf2LD^o/܄TM?C}}[cV]\wURnKt;5 o.egR."ZkM1yR` ߞϬA;'X n܅=< Jw&kS]qQQu>G_c4QOb%Իv#0:%zgnj%ZRw[lh:cJl!8BlW `ҝ-|8p;0Ms qAZ>>y Mՠ^*+&^PxFzF'q؛)Lȝ6kRfc7Z2A>5qO%֢ +,㴎|Q)aT0ceHbnb*CԃF9}R54j'lU7u-&qYkTo$Rf Ҩ c5.&2Jd*$-@J3J JS>a)#04蓂-&T`V-/s\jcZc1 $5kE8@9 lJTp8iA h TPpXP(мp}r3P%@~ox=)^/lڳ19LKU$N BnQnW$M 3xt)ziUNu)VwyFɟT˶dӐ7,2ʏN~LϽ'~ fp,~ȏgx܌M89OQ?EPp{?g^_{6[Ys: 8+|t\NK}@G-pTIUjo茿돯GIw}?kɱD\ zǷN/>f럓{2>yY9vOhNǃ  ƓaLVyrΓp0l^uNV/'g :ŒnB/2LF鬥(.62o8, 1ΕV` Qn:&0w'PDsRb4VLEE 
o.k$ɥ_fy[׿CpK4(=J|߆;`{3$>$[G0ަ!y*)¯O~=5=p;Y0/A2Qaq/48QS/'{B>18Ow};Oφ)}P3g=h8*5uf7.ƪ:b}o)!uHr11oDZbPR^0uPeBjA7@c;}L|w2j}Rbs.FKK/.kHw. %%_ 5h9h~0+ ,PH0G ,ThYu6?DX2?gg8%XSԙR]KU|rx 7>8W %`3_MBDm2!t)"bָթ_UGѵw7wZ,5~iv̼iŭnξ|T"k-,$ +8 &K^)wb0%K?3%q+XSI$:0/Q!7{C0vz5šʆsR )9Jw]rsTrI.G<EgHbc5a+;k#jLҕmUW3t%4ʑ%% Q?`cU\@@ @A8Od*0**!sX:Lf)CNtPS|ʰaSaE2[eY-ZjomFV]D,%$n%tLXP($2 )(` |GGZTg*C;1(@cʂR!^{0Ͻ@]Y/{MrgFӅta<]9/_)4"!,ZʜO ʬ!!RtTa8|!ggwT9rA!br >O6XN'C~˛&JGI Sl9aX0`RXL9L @XF 1D-`$V"=c+ d#c,r"1qu2\=N&~>.d-8+Bl>og,LAÏ{-H ߿#4`{d*]W7T0%!Oޜ's Az(a+}g`Ɠϭ\p;D"mzCz<~w/z nL?Q2k:NXH/e||59Te9.1*ds^*a6FK_pZ3ӣ,԰mЍ"":Hq1rraMuTQir &c$}"%~`YkN!ϬS#(":`|DN[J8l SJ'&xDNJrb  XIXTVZcڪ4='bBkm<)@a^~Եheb;C9xJ%gxЬ橬^A9+e@Լ Mj!0wsnh*F39JOR-O6r# QI5pϬҗP*'|OխKRf:dyO ڎ Re3x CjJ`0z$YإHch!LDinQRŸ!7#iEX6ٓ|O+,^g7,  |v5Z\䫝<^STd"5F=/LN'Űb!_-HW5"u0!XHD0`XNbyF@٧Pg]kjt.%rjC*h()7iL Oc fӑR6#;c3X&Z!B$o2Qb{*CT;|P֩$+ UJ8.yφHENj, ;jBDDB.bA@9!J(Gn]V%ƓYmԞ]ɩrE<ɟ fh{./}.Yt.Ŵ-B_\.^Dxܞ*1҇pu}@ ҏIH_بf^s꾁9NP֌Üu8@PqH2Uz:M,O1y,aQs,ʪ;ˢL9ʽ}`QOTb&&:%bIvUx:Jt3vmuBK%YݱK][vFM+2jgJV߬fK%\zdt;XAinV;c2 v('ԟnirs-É2ğ ( ώ_o>ԜZ6Wֱ"Bq].OCXToR~MYӏ>UڦԗW'KɾC;Kv Z4k[y_XoJ͉% RKsj~==G&*yv0Z &`ڂ) I-KcO]Jɶ9/&v)rAHwLZPp xaӕǮeO?(^i *3 %W(])=pC]uETZʝz樁6\6Qc-1[2YjmoIVhKĴD zp&?,u& C\Y KgMਜ7eÈ10ClѴQh)d[ޕ6"K$̽@.&$/4v;e_`{Z<ɸݢbe8_}.ך`$b`_҅v8/ڡvjh彥 uLʵ 4 4RZ[QFaP+ݞ{]@0=rO'MRctMy- 3}oVĪQ0:H:bwUYNGGք2S)c`+&T@ 0R CEqTS \urε=11kJ!$m')Vi0v|ghυ#ѣ,O99F:3tKREnkM3%3=d<˻XlYQExld$6H)alIE)ӪalxSG2F cŢ$sPHZG-Dd5ƉDF8;%J4$MВR ň?,PS-j mǭ` xqQ3rړE7LKch \J-Wd'ϓGGeG9#'M"r=PLJJ{"$I#ãRc='z OɊf Z^%BBrQG dö؟IhzD2"dգSL㚩jv2gxtJ TMr0)7S{.P) c;f0![C&'G EQ59`eli(ZJ7h+jC#[+3GJJ5Cc8` 螬Ufn&(WFnp5 HӺng@RRr?=@H(Q @]hPaq7*Tui-%/}KaBUa){ Qf櫫)2,WFLHSU!AyuSt*8N62%ѲG2之k2GAΙpC +Rf "I8ˈXZhF (&5UEpgKIfM{xPK3&"fo4*gl'OM ֫yYX3rV\uP!>؇EKA)Ae%wn73h?ս< SuX?|Hi:^RNm3ѐтOQ2cvÊ7Z hs.5:>؛UkKo>OlzŁJ8Y(^?\eRH5Xo[JH@+>x}VQ{rH:RoT(&ϔF.!DW ٹ &@s-eǎbȅ砒sWEuIXsq>?:*i* Dԫ`FCj mNbц0K~10Wq4E] t $Ȅ 8VP~BsFvEA;Zr}ba~+L|Q6t:Z{ jxYmO#P@jJU|«O͌~Gf (Ygf$2\8*:DuUqSl<-ym`N 
u{.S%bG,ӴrVP&[Mz#m+ˤd)ʃ)̖7Hש{D&~0_lˆl迫𔽛M,`he<:GO&0IiNiMR5k"&˳a[ɷJ ZFׁN~XɶAq1GXj4o{X9yUy=LqpR\Q:0`6rl!B9wP/Q%CK}A=m jݼb_:xX.!x$S}s˩ ~9IO\9[rw3T9Pb;!T .&SV1HE²{ j":מqRU׳ى=ƫ|Ȗbtuks*x0_g8j2GctyuĪr5ζ튞-YWes2o2jp.k{vWi䓾眞yK.w(:J-G4WKS).IQO#Z^j=VLHz].kͅjѰu:Z8"l倂3Z"wi-VW1_[ zdAɶ+r(PE&,J#leybL&!D@8"w3*ˏ^?N>]dg, Vh5r7G:33$5 Bk+ZQ?L _e9 ^Bm['/U\ʚ~\t0DWt~9#WeS䃱5A[xL:$(<{L:} y8J| X".&d ՎpkJQ( 7DAii|L2~ރ_$iǀ9FuSr($qr'"!9 %!Z)ch%D55]JZ#mjb t R0=C)L VwJ4?Udgh-ٻ@Еh.h)q:.渁-hnIY»aܺ ]6{XNK[>;γk=9)22'?0cӹ{`}~kJ~6˕@VG 0˅g[t5c3`M%~þS3s愰a臀8zְr1t^)-90$ˮ!Q*&xAw⠻u덗uٝ˺qiݚ"=3d[N& vhoEF8vmp6n3[F39"-9*bFs\)1יCd(?͸PnZF0]'X-tf(`+'R1R&_-n?W q3BɅ֑qMf"2|<טnk̺x\cxkD7Haͬf/t~aiY3A8uiP:rcԉd`*IT'-tdRACf:Ϯ t8W\?\e(~Vݚ%߫eͻ-^H23>ad %CPvg 90!ƻRh&jݸ,IF7.j\?=ƙ4. Nt>@MDy :8LsA$CEisੱ8A@ WSvЅjY)vpV:t<+.׭GDJbguǷXwr{F 夫pzSHUIO,g1 ZzsT)#reb@ << M& ~ ijzD^JjHL*&T@sahBx'b~ۅk&9%,k71O ~mU;Vn,?{X܆4K߭Si7'qv~e?m?_EHs2zx[7PP:Pk9os )aӧ93_~$]/U~#ߡt1e˓ۯ@3ePɍ]>&T}w9BORC?}3 |}y//b`ML>N7Wv9?B3`e?>K7("ZwFdAA0AVx0@$ Di%4Kq^KJ yW >] !О 1 JqYƹ8p&hO5J5 L#"z6'5 c(Gۣr猖 Bp : Ԍ Mځ+U0i@@UZ\U5?h"*/)j-sD43F* &\r=DRPrQ!J'@hFjQ{(O@DI;k80 :`57{WF_v{%>1N0,kۖNv0>,$,Tt:u{|:Fj(C1 SPoLb9k{>dhQN.Fn"=\Hɰҍ+rw_v{#6̾0[ U8hէ Q]M 1XMh XYB[T"3pt Cʍl)߽kK7'Qh`|Zr3KsXV>,A3w~F[o/Gf|-b,&rRtE Yz%w . 
OMmPK4qeELr>[EVw?trx GŮޫl69LkYw"FV7툿ZQեMK'ªw>&kUP$*ӗPBYeYLWMBUꑡ%R}%$zH}: 76?n@~{KcݻORZ8D᚜6R0:}^x|K(!D 30|meZ!>%|;`"5(g%#_p^Jnn5ґY[&% WѧZ ;h4>.J NW}|Birtt$=oԧ&払"ޘuPDf9GS^ YHV:CSǰe$R)aVcI~rM/~}_:!6^3El U*g[٭\ToÕEV?䗭l>plfwqY:\I5rulbHjk]j,N&F g'V9<+⻱Vsy|pFye~ְh!b.[ѼPe)MK)l|(4{eUﲅĘ`"8%XvQJ QVx֪Vx+f OKB{Sba,A4l  v ?%Xh`AmqzHR`U&&N *x ?gvL.)ͯo 'xБg#ډg`*J$`k0 =] 6fSm}"IUI (!)/Px+8'- [\`OƆ9X jvˌs3 2XG=  "}Mfm_n-(]-͑nΑMk& xe]EwدZ q{Fh xi{x7lHV~N ΋9`1BL"d|R5-^|c-l~;,#7\r,o\\~6Iz8e91O|5b7W-> M, ^|\m&O)AM/>ԳxBs4mgn U0G.|.$P VF!(WG9a/>9_^~AYd۪9;(zXb#0恹}>Yc70hj~2 ͅvhM1Z*IEY]mbBcbl8y &NZRl P9˱)G N=x$Z|,y'ʐW#(RR(m(NSϐ"I5BbaBR040+;/sg v< Jq0/10"`ԁBJ!GlJ"JR:sJx$όfPX#ìTq›bg!%a6K6hY8 SBSLuKo> =-$=B*]NiDDF}@놬uBLjQ|SC DhkHcC@ڤzN(˨b!;+&+p@axk ی8(#$,] j -$bq]U^o.7.o!k=, }]XD1X,6Kh`8rx *S泚oѿ?ys`7W '^ĀKLPY rW/nLa?YZ̀o_=:8{0Q\G!LGxQa. rq1i(Ɉw 7 !`]`.>CMzWCx}3a/}=h3(rQ a ,5Y0+<._y^?ޏ˭]`Vru(U/TY(aށ"H&`2jor15F`a0u8X9"ϼ(3u!,pmN J#҈_wuyjU2sa3+dLM„QGfcRw:R-@F 6HH5o Ҋ!;1 SY{lZ=,&Sb-żElO苃pD .ZBU4+/4ng5=^9;0Ơ!?ծ8Bx7w?!u//K?_ }rk(|h}OGo>M,G/4q5cL:=2,t4\-·sKZ:gH͙dA&-h,[2xDˋ1 ẗ!X{mSW~7>GU6M!(TjX;`4z`*֫*\BJęvgh7I()a6[=m͌cDoՓ椱Uh(t&*>NZDǤsF8<V/<HmrKx5D)߭.yr96CYfdX͍8!æ!",2o05@VVw;91@NJlGքgG^k&aYJuTb^XAY2bˬs>)ә CWViWr!`M@݆ApT(öވR{gEPJh!M-A-b.ti4J])C ~Li&϶l-&͖Igs[] ibFm[KcT83B5¸p B2'}H *29Ӑ·vU"&8?z3 7B@֍ǚ3jcn}?[ ꎱyXUx//r0^%d=9D$kI28XΠqV*5?L:]zs5!tⷼ]$Qr eO?:pΠ}kU#"P>.' 
"JR>F qѼ|D~5xu[^|8*W:wv¼謭.pݖ&*a[5yUY:7JkV\К Vб"Mz{IDg`&qV_296y@D7rs97h Q_M7`DǯXrt=\i:Lq=Pi]ꚻw]u]bT:Rad$f09Xv))iFrإu'uX]"Ft"^ńfpTVц=%S>9 =WXX*0`vɼRe1Q Afk?Wh#Y#.S"5bQ-l i+$\ +Xp K1bDZ}1 v.&J"~?em"C1wl1UBpֺKlP,{ք)AwEGr3%֌?ox:;KO~ ryuyј`9?muNyUPf |Xdw>8duU"s- J*MXgZjG3=[nViI.v}3Z`6k๷76o .ݙ{PTf%)=a[~Q [oj*t>.>%vd\)[\7*?K0-'ೲQ1,3JaP 2*i@:&UʐDrcVMQ-I'uE FdPNԊPxOSQtAu A}AP1]㕠蜠_PE3> қlo- P|6hܢ"Rc곡4we=rF4b,$h_fd`_lyJji|`2U]L2uh:2#dNkV∦ᱠm[6kJK1 `RBCeD3=73wN~8kmlsߔ5d$xo?E1[ C2׎zDZk/oeNNSrC/;2Ros:; nX7 ֍.j$yt>f|gI~@'fk 2ͣ7YdRЄލWWObjG1TSNe!۩ѐ4xRUΕkmGGRYn"nۏ 1jTDh<"E8L4i?ݜ=|,-nZQuns J<|jm"sxf0_G01dcs \UjM.*UP,tQhec)@ƑP4#-P V5Fj#Fv̈́0C[؍L.&w#Im1>p(Mm!4Ƅj =;H*g2@cD鱍g}]u/241N 3L=pRS(5 C EߣW*Q:q*U癷 rjt9',֜^*x>U2{(7Yħ\UfsMq颯Ť]ߣ4{R\1ܡ b5{X>It= J|^A]ԺНhւ5b5%p߹;/= zq"f4d[ͅyͳݕ5)9M iBhSb6`I.hfM[S5EYI)BHEEY\>l5ϙ)+ڮ=Aضl&x h}CW(|R)PŎj1eׄpFYQB#tL1aE;GP*n8Te;ΓL ;iBK10ZBh㕄JYꆪT82I0VImkfI}Ȭ`E֦gFeֶq5o[ ik\ouY6&$kF , 4vBnxY@J{깛!a.qwonշ61laoO5[fܡP!間ÈZ$U(\PPSQYXpülkPB5Z.@Cv됮N~QB̲NKd'Yq(7Owϐ:Ék.P%B&GȄpݻ" Ez$wo('4 g .KK;\.*)/'W~1SѨJ@TG%lJE!.B%)qȢT/p-Tʳo ܥ,iC(|,i#Ꞙ(OL5kرon tB 4bX)0CZ%T% ]s*n$ovZP;McݏU V^<?vܵ_bnt;Ƽ^kAC z_)OV:! 
(WLϓչk 4/&'r4GR&SA'cW12.˟뇲j%yᄆvWHz,4 d3~PJR.̴30BeyS5Ynr"ooY:dj؎ ??JRI3S9st<N@$L( 9H2!Є iן,fbT*bddu#i Oh)fJmMfs2sـր~~hh߱.zZ7_YQ/j>}tm}}Ƥh &.9%B4,a1]sFZLTLj5\}?,QՇeubCj]wW:)Zz4kz\>s<-ɸR~`q;_{;g*IN$yTiڪiT\طת́7ڻU@ ^S7RRs C<5m9Z{s|y96$}̗>m{M}u%Ջn^%3飁eߏh s4>GutFHT8ia~GWĺkx`-kTe]ְUy7j{Ȉ8MŪrc{uWՉp{xS,]Rְ\%۟^\s"ͬ<A|7+10~њ}o3&2 ?oKkB붰zk&_O)O7_U}mb'Wu%OfU/JMlko7,߼ۮDWmc!|޼xL8T8 dewSDzͿDF'gG~tT*Rd!gnlij<6>|i;Vd-)'c8TM;$3nc113͗!;=/BXș2NVM6;.X |Lg3nŵ:ucn4I(mHE]/IIS8]Ǭ~IuxK59 r2xMɵb bV55f{EPi.|n?T ^w<#}EhrE/~'`ЖnZtN=Sɜ,t_Uu`ŎME,P4W6ⱺ^l{j"A}1dm8jzO ^ Bd/ZP«uj}O4`Kk9pŢڏ0 d/΄6-aQتv)'ghf~r΁3%ނذd͟ߥ9ȭ/mô:}zx h鸌+/M&MpSm!·W_?/ʾ%;왪5ۛ/߶^Ior#S,l{ NgtReYUEShrԌ7J@"ׄPFԢ "ۇٟ0<Ѣk I~Th۩j%1tP3A7 UnJ^RY54"N"(JU&x1 9a.eSquZMKU.LN6T)T좥|i4񌴕`"rlňGx 2MɆϢ= T}-9_}γ"%R9 ;m89aڱFDWc̀C3s%$s<ݔFyv޻GڗU6I྅!lR'Hb$B:%f +NG 2`ZiZs|G.bN犈N+q N,^MFAN )cfWET#H¸qcݙ0k3/WC YIPe4π2Ud#,3^c>6 U@~^8)D8 ZE:; ]j8jDRRMṴ%p ĥYy!jWJ+A<J0c0 @Z-nf +TԘGh.yU s gU=Cj(_TXs5(a(ӪЙ`cLEmix ; {H$2ie@nHY"B6 17@Y c~3:C;Br'eKnvc3n&YnE43ܝ<ʘ_jQf&hb&Ьx.ppY .QxL߀O\"2_mR'S8~V[Akc+TiW뙹 4JzH ޵57n#.e/LRs2Iv_qu"^T*4(ɢ%EQfJMl_7ѢN|>S:K.KOS J`:tX]ݟN#UI>figF~}Wn;QPmӾmsǺzQ'n[ЬeiQ!^(O \b"uoZDXSGxSH{S3m|m x1ܷ4=ӎic-0ģUUO+@\c3rc,$?ܕTOuH^ޘ4gOɔ:x;gHp6Yg^YQ^m/ax~v`_3ψF3@XkFy! uR+C9<{7Ap,4y4,&hySe \*>:4*K8e.iCFBWo7I W.w{ޏ4Wiz}zJ^#D^|k~>W˰;C}3;p"j1lo8Gvyû9_%o=_cyqI~ ?=mm9ńh"0"B+lY.T֔1V8FJg݁SDKg9K>NAB'b{Fe)?[ǝ"Rp)Fsx@9E Fw,3`#k1@ VԕRi)"'3! >b]@9`.d %`\V s?\z^$ef48͸pbCA6 &0W!!0iA@I@ӊ@!I$ڇ p. H($&@3:"4!'L%A %( %rZ!ItQM"7wOӆ -HS@?k,TH嵧)l̉") 9giSʰ"j%}e]. rOg~Ï&.HEȻfUoэno*"y"~ \px $] [WeWYh2]s69sB⯘R2 J`>o j{8Qza!|у)8q- 1m-U]ƨ2FUFÚ0B΢W=B!`4X&c9i±YF[׎:mu|ga_mz1B%$Q.c 獙˻M6e>_=_3w7+b\E9GV_fs>OZݏˋA~ }yw_^=O z>_:N am.A09G xž$ócg^1 MwlJBWG\U]JZr*5`! 
H"jN4‚l , h h,N~;;=Asg8^^5ӒG A0a*PdA)hrVxm} q3^ph&Zua%`l`H1$C1N~&VA;'k=G쁃Hǥ6$pEX3zvP"oK]$w?%҆2W'*#0Q$ E;Yh8gDC­D^+F{I52kjGU^7 /zy_Q=~1Նf0|߲HцB0{8~4Ұi[IWas,V15p7la(+#`ȉbl'qQ8ÂP^rr>P0 ƕҧyzzN7&VQEW0B"#@PNi ) ~hmplj[w+WUx5{1HL¢+2 lÆBEq@p9$RR0"GMJ=B >ZU*FTUU8MJ!E\[1(c}GF#K zK14Ə>o:wscn,CAOoЂ /,1ba1מjAe\AsS+`I\7|X1)W{TMEO]&t3JqhƷWkk[v:oLAԈGr'ts<(8rSUA"^)c.jLuSv/قsgK}jsv1?#"v$.UP "Vqqcxq0KјüAԯFkƄ/ BnlipNXQP&I@8Q* "%u\y b#p ̥H±blul@]d? +,S`0`W1AA[-!ʫ-)g1xwU }ͧi8-mTE(4l QtVsiqr &ȁwjPH.9'G2F!qǫPvX( *=\ٿ=r&KU՚+h1͉2,`A$` Z  0&AB+uAI&0@9m oSFA!m5LBGE DRqѻ.XC|6|BFkˀ=Bf 1!AjҨvD%MB` JDLՁv7-rL/PL'N y$wa$ogh#oӲQ =A1WiKxU <a;e!4Nj "2eZ>j4 0׍yLnocF-s1oN3z}tʍLj-Ǵ7lǷW^ oϹMp7ỳ {ev{"6o\!ß4i.λp3lX1G3oz^h2h/7r4{x{8y*NiI6uլ\>okCguw}Xe 3 9{gκzO1l>rzQpkݰ>G_w4Jg{zԟYLdwL^4% 0{CAK^˃9'mfcRt?f L#?j8[qg [Ue{ hTV;(?-5G+f>!on?OP-6,s~fo{{?vvqY*up~s TtcNyylYгx>tLe.jt9nzh^#O=vꕨ!!_v)!N.~VK(u}W~q{v* i?x4)tbBzx>uLZdXV^WoLPaW(dz<Ժv 1b'ș:Wrh\xoQ娕Ni{}?; ?OOj~Z)![8vgOYhQ$l5xĸ|ilaн FejG!8VZKK䯞γխy{%ZjWjEkU}OI/Q"D{F봀m@c)~^՚_k.!+\ ̹ɻtjcQrY"ś7q.]WGf' Ly}18hgQq,n'-}`ᶮUbAHGUPt9.㼺2Ϋ5N`1`EQ)iW8h6:@iS.pJ)c{,WR ǜ܆҆L%'I@\1 "SiM`f`Skz˭6'3`2Qbbn PߊH j B72㬴wF ESNϗ~86w2ũ9R -< 1I!l@#mTS-fvIyjiH13DPIZRXv+|j!3Hⅰh0dW? 
dai̟Y}^$Nc~dl*+Ηٯ벼`- JJv</3x}70 t '3'tM'Gy8Z JÚ Ւ܍W ۮj&=ti$Ȟʦ3/SQ'VP+\@zYzc+PW+Fq[F$Z+PE)/۞oޘş7Kin=#lABgnn33J~Gۇr1Y;;nŜ8HK[8Z)rZʧKN0'cOp)pMosٻm$Weq1_A]8|`@ݗy&oGiRN`jT,O=6}WORѪF^PLqnp{Ϟg q]jIm@=@=@;PuW֦ݳ(!%t*o W|@ϳ"_?lfa="׳1y~=)NJ8_ԇ"NiầNuv_9;(a2d4v@9`0BtP8ݨE"%a$ZC _y*Kfp.b3*ANKtJnD٩89R O&^Ѥ P0nPQj Y,bh0TpTRKi R;li$e& YX Na" pտ0Yhw7j0;5jQ>j%F-IV!Ƒ)WHpVjL `1[j08aRKVq+)b,pu$bpej sWSD#;OzȠZ')gzjto9[bg9[b^Ј*=~ULBZ5 q-"~wKS|),9V}%grL3]pݾr2 qF?߄T0|madžï;ObĤrtRx\4*rJcYӯ3O2| ^=dYJİBb=TpX9jPLfРH{PXh_p̛ -ƚxĩw5i*Sẻ2Rr--V*ļӰtq-U2KZ>}{9_rNk%/@fsha݆ԅhH(\0l VJy[OBHJ+5JJsLP YV𓬜e+8n+|RC= 5 4T`MfsP\nj[Ma2^jj\eL&\N ֌ kYem!KzP0b#UUsōd3-e@N9%F9UI3ȋ%p0jg$NI5XPjGрm*G7Ә_ď?MZw8c!E\,ο Dz!޸uE4%-uڦ?r4Ҧ77m2-]ӭۙg߅ȓ"Kpj/𓫙S=᧐n{h{uC^=,`FX?vY@y*Ψ |_TqVU$Gؖ?obG1͎cKPh&:5N6G3b}ginTD'S2sA#( b->+NUkFo+<#C4`(QjȻo4haC:-K*Zzu#k~`Оf DZ 1vW_.]=좕SD9/8IO%d@>xaI:L<\Ɲr:ik_&/nCIz-ndm#RԣDDwy.GWq1JרF4Rm+=zONҖjZ!%zL B"SRaЪǶGWf ckT2I3ciyQ>y CC8C{x֠(`q+"DLx$& ac-Gmdi=^.bdJh#jO2-idG޾Pwba1:a%z2EasnRGuoJ ø{5퍿]' ؃ݪ E1p#p?IT#i(wO&`#!}?BuEOuq2Ohpk@X{RО9e-hdZ?e7@ k, N&H'cְZx<泷fMȚ#'L~>·\\"DB= ݉P+ePNu079&)|iG3HxDã&"jkN4J-mLyD&tѣ)b+#!n?z @]78X L Bjky ooWQlߖ&n=M0fyx>~ X(yj_>a~㡡{xQ R4b<{¾!\z]εYlbs[x>xL6Fh(`͟1fn X'}-/e#6ΰȴ}d~zt'*Ab#OM&kI?T=`&G:v8h jjuhZmL cyLy|ؘQF֗F1s(#MCeFe Zt7I1M1 X)g̵l;- ~&;U# ]\5"=ve2SFqKjMal0'!LVK{fIs Ϫ_V}[絒D{rP" R-m\m׈gƼn? 
PlE$ @'Mw6)#O$9CrΗq4$&G=-ӑ#JQXpWf5b[њ1!)*nUN#}GEF;԰r`CQ;$S4Rx5\Vzf~ڞU5]EbqoC"Xƛ߭ 5õJck6x#o+TجZСO|:=DDnu+ٕ z{Q' c>[]5韏~v1/.NO\[Oeۚ'U.׏|&eSrSg7p8bc:mn\g1nDgb04P|jܧ/2*^Y!NvљN?u7.NW1Duy*(' &2t #6<~rrLWM0,`% qĥ¥E}4U*ES-X6' gY԰NJ/ +h*Ip'W9v}15₨>!N('0I?o9OG%>Jk<phE ҂R(;Hm'" @fA.qPg W*1\3C.cĕ"&P JOtKLl4ZcوNh7U4x44ZQN5-KPRRzm\{d<+vVXONґGmnCYFxgUDbGc69X)#.l00K;BժDykXɍ#jv9hbm0bbbZ1lTrĴZX#B⴪Jx%JVbPga% ª`TYxw&AaKŸr-by eb *w Lz;X HǬJ*=9a*=5ጒ;ra3Өt#/dT^t7X``6혩Z@ EMPRH#S+2'#GaS{vBQPt5[,8rbВeFqdq(,q_3r1|s8aX/rH(&tbR<'߷7OÊzi q^ůa}YbY𥽿y8K,\ ӲЊW7Vf0_H\l/W; auzo< wۯgO@[oWۛfC5L 2שׂ@8nȩ;%y@Լ=/$sK;'2qPA.SGYV OC$G§d0lv1_-ubClFLkk|պzHIZhWGgO +tjJao|]4{f-^7x32=Ճ\3!nBb!脕kEq7o^|qyL].lh.T~Q] s1)*Msb|| ;+*SX?*Fiӌ/:Iwr4x߆Cdػc#X}MZn82ޱ O 6[1h󝳇9nsh܃}S9#j9_Nps _?ԙss d) ]8)m 4: >- AdX]`[thVX.>R(2k&@q>IBkY++<bjh ևd],kQZh!$<-oWcR)va]P '"HZr+V'_)*hBybP֜: XOuQ}\Δvх\P--Q(|FgI-w%+S]COFr&F\VFؾ=q~,@F٣sWu)#Ȫ;$N]=54PhдuQx{OB\[?|<cK]PO&`0GOTڀ.ZKeViu$Ic(gs)߁,u `3#b4SGs^H D B"E= C@U 2b"Q14KCpe'r&x (žX_Qeu$WiK"dZ6 Ѻ]H:ƐQzH%!XN6$ žMo5*jL80| 9 r DgOGG CADmk^>)+TwYZh"Fʨzo#UIWw~F]==!p*ƛjevPNֽTq/˾@t,t6* %QN #6jQ< VDfG >oO"iҮ\)|#QC9tƭVLNH;5"G4γ ,7/ʒeBή Ng{K0 Ύ::dSQ_G::-?{-%1&$(?@Eo[}f[⢐7\ $5Ы4mTH:wdNᑿ L2tonەtn/믕yx#џs~q^Vn/9\ロMX1|s_`cɐFD9Cjg^\<1<\ԇǟW|<jO^`Ϯzv.Q"@"*FWEk ,SrV>8JA$+sof? 

Ҫ7:w?קu,ӻ3?VPwJRc&)Yl|As>D=9e|9:Nؼ6؀cQV"Jox)#fM.bHg62ӿ.%)K5$ U)Ce.-cwUګ>,q;~|Ӎ7,] 9\j%dޞݙ/{5z5=zsJ kgm~<mɄ_Fc7E*4ӓċ_ZwNMoxI1hJɏ 8ꎰc!ëQƚqH~ߢА7;Zc\1༖LR3QI+`KFdl9=ǀ~iyX{`E |=: {y2kyO9eR[F_8_6 ???ģ$gYO "<%0|?mDD\3"XVŻibvy{/qmC3P'MXv4FݫW w^SCְK2/\׎6|Ǫ?|L9S΋1>2e$\]u<, NeBE I'II9 ]?op/ކm[U6mN@3-0B Md1sKJ, /rg&[QO4iOJ,m 9=_\LfTh 9ו>(A/Ư#X`T4 "Ǔ—Ve LCD!tt*Q!ezېz; lzFhQa1|mԣUF R6rk()Z%&1yJR6hX'E#3(~ r0A/Hy!c3jV }*ˆ"1=>[)+tvN dII#0^cFnӋfF cس/_}eX{R \pAzasj`g-77B`> pV!Vr+#>)=Q) jKz@Y#b#'6ѿhwL<9IS_Q|VaņԹ#Pt䗈pKN*ɜ4Xߡ% |`=Z|䖻-.`@o%%|0Cw \ƋLI Ƹ]^z0o~%`, `~ bDFb#zg=>bVAzT $)E`G]aaOac}ս1s8=9T93o Ynw-&96nX_P 4_t8Dz"XPF _b`ݳzA( .Ǫ,1Y)bJG GbBDt/χp7ckr꼢KR&Qx4/H<("7325~ߏ?gLfsss^a47$Sϵ"{Jb)*rV:`GPV_bo>"![׻dZˁc_-GCM=m|IuW(04Fvz8γ92=b)9T‹'FCu&_.~'q<;f8JaU$ /$%Ϟ~U%ծ7yiSipIk[R; a)Ga9^Ӽg;=;fxnxZ#zƹm!)5{$j"G UA#&HAp]oYšvm#ٰQAґ᠙e mZʄy)RB'Xa"QN}B8T( :,F)E)8}y m;HbO'6bݮ X(Pρ@| Zљ1Rf,c$&,B{WZՉZnn7uA6Uw-ݺ_ihА\E_SeV Wmk >:(Gh0bKznv6 [6b #Q!z{D/x"kw9 'TxM@L-U6wI3Iz欌x7LkLk5?2C* ŔY%|V3J<@L mxӣ:._9vof3ڪ BGza7`i NHժbAK6z"T!gv h 揌Su“n4sd?h_&826[C4 )eSRvA%:joIxXWhÒ5Ӌ/Dz27:.`J4FF@&˚MLmH]BQ&0rՂ6?{âa+H k7OFYț_L&Ƞ5ɨ0~ϔ $. X $z?Sm4@ÒV,'0]x\HL ELċ1B% P=^ծTZNX`D`d8i^J`y @h)'̆cDĠPr`Ѱ(0 G>kOJ0J ~(sJc1&yMIva̤cMQSdcnx1]Ir 3,Yt>xP1oWeCBwd%C1L(=e#ЊaWtȄG.WH(uB 7^@yZld6WM.Cdv3_+17 =dW1@-X(J}x}$r@qV:ǒZL XcFbiu/;<4]kI7$o/74̀C0(pxkPi&eF7QɘI ,e5 1.:M+|Eh,-Fi*A<$a*0I:g9+Ԭ盬 p|fmT'7#b>y9:Uza[j.z+2Bd7mxC/]y*>ϧOA㗨BV9RT*RqHPXq2]ƅ9 ava^ j!hMjY :}"lR=y1^ CXvpկ_ia͡j=Ͽv*}RRgv6V҉B>Iߖe01Ɣ{)רYyҿLezz/Oݻ6%q!q_ީU˪eUU=^ H'(/yS' PY8/XUb8S)߲4L:$̿\:s.St ?朏˸x?|k ͟{ GWu@+V։K*wu*13!3e$@+!SAVX '։@`vf-/ah 1bhצh#0c|E`ރx6ijS`B|Ž-P+C]a>./|J+H|AH,L䜁QcQF|1Lr1ѯh a49VN ^G}uoᨿ%IJ#È;AӦ"%V@`U-ZZ(! 
$h 1  G^Ds{XG +< _YN-ʏ5 6%e(c#0UhnR/P,Xff.^H~ɧD*^T(~}ĶJCKyPshbR Hί,1p+įLAǟs.G(>?atz-a48/WיOyC;tx~Ⱥc ʘow/]]kI_w^iW 8-OQ%N)> P<{e')\K#ɮ0f9%Tz{xzvʶwFl_D^o~ػj̴\R%jXla}9_j}˙R[Ph`B`5ϠϵRXk( jm}w𮽩 -Bmo} aD_@0,sw''s;>?\ƪ?ڜhAZ^HMEjEZ>mC~=A93 vC̯[,皜583zd|;W90w?c46&q'?v0fW76^D8\C 8*~ny+_ĊIRU$xRF#EJ'5R f\Dh۫baV6$3jA=Vfq [b8"ZOGn^ֆTquZAKJl>JQZ.ƁL@K]%u#6!Wo?Y(t,rM6}M74~߫ilzb\t)v1W[%-~xEg׌fR~ yW / ZDgot8!)Gt(IF06=/\ 14h>v_o޴7>`h%rg!BξGg=D"Ȇlu _b5|7|t}K.i23~g4G0TZvﺝ?aGw;go?;;*Oڒ1kmY6jn{f<{TZըzz׊,+3 ΃ L:9eϕ|)ըcOWk;MZ}_5\Ukޛ~bkabԕcz#?j5خ4ݺՖZݘ{V2 >=Yds_mQ˩R+ ם2^۴pϸrpwo0C~0T!1]Ϙ< toq4} C`FcԩY([ 9 dӀlV}^+i2.nnn`4 e/xA%,䕛h#RU囹ݚbPb:MQǻ=c]vݚ*nmX+7F6%wkA4}F*_ּPTwkB^TaP]NYSj41n6K~~҆ n) [L\-v}]sOޕn97GaZ-psV.3| so:*CC?V9vy+`~a'|]<^~5#qGώt쓭ϢN'K"2>|fwʊԯ5fA%n[OjIP.!7TL׳ɲe]t'kh a'7{f=YQtq0_̐iqW!5g,ϽM53R9WqT|>8 2/o@d@!ԑц0sK_,Teԋ@\9O5o+= QY'b`)Z˴D֙;:ϵ,>gbxLEp$9q{*l\a8WT|F ڰE 5 9Pw){xrਁxB[ źr 8B@p|bXXvҍ/z+ZS\Ԓc#JTahJwvT^mKηԛJ(7*N.^KF0`h׀v Atp|݆VRj8} N3RHUt(UrySkhO^yL^ޖ̾iݕzMVrEFNOq)B„q4X4QP1K"XIrE[{snU8UbH%"`MU!ʧʁ,;84u**xS{Wh#bRA97 pVӐKJf#;r \c.B Ʌ\Z-$2LjH\H`32-xH^B ]jRwy} M> E&ֈSM#6 Krqёq!.whVDXFRjz\YyFyL&GLpxf>PվFJPu[4I-vzZi-lK*˼ZHpڄk k*}/,hQY]/Wov88=LqH%V}s@E)[e" 8iF0zNMgL*&-~F ` MK/x8ZIH+G:-scU}DTsċXІqSA8@W." 8 N{9<ؼ2WLӶLbe&] AC"o{)hD $H+GPh JjaXZ*) ̙|G"p>^=ȝߏC:nמ ?8t$D"D{,Ӡ4АGhU>w/_#i{#mOH̿(di&r&/{S*n޹(dGS&k/nb$]]Pkt@7@߰MuGa1XΏWb.>~W=O3o'?q緓.?hkN/c_fvpU;cYbʧ{taLQ&֝2A"n7|HzLa*LZs~rf=NzivaĦhH2+Ty9&5s,BѢ.g~@!Η=ݻZk2aZLtc߯6`|Nz~?aBqb,qsץ]QW* >{7~zJalB0$7՝bz\;?rW)ysG`˫ݙ 3ٙ .Of?cJv6\l2`zaZ z9 A2> ¿376*zmAII݂r'@9MZ!* C޽~}6}D"r qozxc[~6)??;~Q7y%ʂ- >bQai* /"yQ(2&{.ud5b.3ycAz>Lc{ʲ,90Mf%CVʰDVR! g (Kt`}0KGe'0Ey/#yXz5*[8OJ](! X +:$T` Bm#yB7&MژM5ҙ1JP4V#HCWS9Z -s6,Oji(0X*#8HVcgƷv'{W#

Feb 18 13:59:17 crc kubenswrapper[4817]: Trace[539346253]: [13.314486157s] [13.314486157s] END
Feb 18 13:59:17 crc kubenswrapper[4817]: I0218 13:59:17.919871 4817 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 18 13:59:17 crc kubenswrapper[4817]: I0218 13:59:17.921347 4817 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 18 13:59:17 crc kubenswrapper[4817]: E0218 13:59:17.923929 4817 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Feb 18 13:59:17 crc kubenswrapper[4817]: I0218 13:59:17.925104 4817 trace.go:236] Trace[580476168]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Feb-2026 13:59:06.163) (total time: 11761ms):
Feb 18 13:59:17 crc kubenswrapper[4817]: Trace[580476168]: ---"Objects listed" error: 11761ms (13:59:17.924)
Feb 18 13:59:17 crc kubenswrapper[4817]: Trace[580476168]: [11.761293371s] [11.761293371s] END
Feb 18 13:59:17 crc kubenswrapper[4817]: I0218 13:59:17.925304 4817 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 18 13:59:17 crc kubenswrapper[4817]: I0218 13:59:17.927490 4817 reflector.go:368] Caches populated 
for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 18 13:59:17 crc kubenswrapper[4817]: I0218 13:59:17.955783 4817 csr.go:261] certificate signing request csr-528zm is approved, waiting to be issued Feb 18 13:59:17 crc kubenswrapper[4817]: I0218 13:59:17.965589 4817 csr.go:257] certificate signing request csr-528zm is issued Feb 18 13:59:17 crc kubenswrapper[4817]: I0218 13:59:17.980812 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 13:59:17 crc kubenswrapper[4817]: I0218 13:59:17.985297 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.000698 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.021993 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022063 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022092 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022114 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022138 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022159 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022183 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022204 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022227 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022248 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022273 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022341 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022362 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022383 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022408 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022430 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022450 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022473 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022493 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022516 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022534 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022557 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022578 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022602 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022620 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022637 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022660 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022686 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022710 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022731 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022759 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022788 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022810 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022834 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022865 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " 
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022892 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022913 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022931 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022962 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023004 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023025 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023053 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023082 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023109 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023138 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023167 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023195 4817 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023222 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023248 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023274 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023301 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023339 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023368 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023398 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023428 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023454 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023482 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023511 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023539 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023566 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023595 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023620 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024133 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024163 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024190 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024220 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024249 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024275 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024311 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024336 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024364 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024393 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024424 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024451 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024479 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024521 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024549 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024578 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024611 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024639 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024669 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024697 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024722 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024750 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024778 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024804 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024836 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024867 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024897 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024934 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024961 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025005 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025032 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025061 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025085 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025107 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025128 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025147 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025166 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025188 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025208 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025363 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025392 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025420 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025445 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025468 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025500 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025529 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025559 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025580 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025607 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025636 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025663 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025686 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025714 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025743 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025773 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025803 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025830 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025855 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025881 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025908 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025937 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025995 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026027 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026057 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026093 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026122 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026150 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026177 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026205 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026233 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026255 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026282 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026308 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026333 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026361 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026382 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026402 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026422 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026443 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026463 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026485 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026504 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026523 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026541 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026560 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026578 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026597 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026618 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026637 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026654 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026672 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026689 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026705 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026722 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026739 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026757 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026776 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026793 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026811 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026828 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026846 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026864 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026882 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026901 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026916 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026934 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026952 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.026969 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027003 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027118 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027139 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027158 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027179 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027197 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027216 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027238 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 18 13:59:18 
crc kubenswrapper[4817]: I0218 13:59:18.027261 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027282 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027303 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027325 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027345 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027364 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027383 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027400 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027421 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027441 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027458 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 18 13:59:18 crc kubenswrapper[4817]: 
I0218 13:59:18.027475 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027500 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027524 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027542 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027559 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027580 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027626 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027647 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027672 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027692 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027713 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027733 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027757 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027777 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027795 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027815 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027833 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027851 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027871 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.027892 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.044795 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022293 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022657 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022657 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022692 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022692 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.022751 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023031 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023040 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023085 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023302 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023479 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023521 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023669 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023683 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023693 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023688 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023696 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023831 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023877 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023884 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023921 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023936 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.023947 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024014 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024175 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024344 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024372 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024416 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024481 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024554 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024600 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024618 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024728 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024771 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024811 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024867 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.024921 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025043 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025050 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025119 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025256 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025446 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025510 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025750 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.025769 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.033809 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.037283 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.039567 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.039777 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.040806 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.044396 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.044826 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.045099 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.045342 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.046287 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.055113 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.056211 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.056470 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.056672 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.056854 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.057273 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.057499 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.057627 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.057724 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.058459 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.058622 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.058739 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.058893 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.059247 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.060457 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.060495 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.060580 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:18.560548141 +0000 UTC m=+21.136084124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.060751 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.059350 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.059574 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.061148 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.061297 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.061686 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.061891 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.062066 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.062212 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.062355 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.063002 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.063208 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.063458 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.063756 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.063850 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.063958 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.064030 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.064101 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.064160 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.064315 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.064468 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.064626 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.064718 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.065157 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.065682 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.066591 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.066704 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.067225 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.069092 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.071405 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.071413 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.071461 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.071813 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.072017 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.072075 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.072314 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.072575 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.072658 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.072764 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.072946 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.073119 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.073499 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.073305 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.073582 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.073682 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.073836 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.076330 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.076378 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.073898 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.074059 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.074284 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.074899 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.074924 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.075179 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.075195 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.075426 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.075743 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.076490 4817 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.075753 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.076005 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.076230 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.076645 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.076814 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.076905 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.077207 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.077330 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.077360 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.077443 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.077348 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.077686 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.077875 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.077887 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.078054 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.078098 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.078220 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.078391 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.078456 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.078466 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.079008 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.079092 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.079114 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.079241 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.080192 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.080542 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.081005 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.075122 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.081232 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:18.581206145 +0000 UTC m=+21.156742128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.081522 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.081879 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.082076 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.082089 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.082181 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:18.582139749 +0000 UTC m=+21.157675952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.082272 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:18.582260792 +0000 UTC m=+21.157796985 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.095908 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.092133 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.097428 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.097885 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.097933 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.098451 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.098506 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.098529 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.098542 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.098561 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.098575 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.098643 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:18.598625596 +0000 UTC m=+21.174161579 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.098899 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.099222 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.100546 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.100725 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.101087 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.101334 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.101613 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.102092 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.102519 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.102905 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.103469 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.103526 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.103734 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.103799 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.103928 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.104303 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.104891 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.107972 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 09:42:55.798272247 +0000 UTC Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.109424 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.110023 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.112661 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.113552 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.114022 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.114534 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.115686 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.116382 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.117842 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.117887 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.117992 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.118474 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.119455 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.119663 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.119853 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.119908 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.120052 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.120286 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.121657 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.121844 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.122081 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.122683 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.123477 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.126266 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.128564 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.130200 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132049 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132169 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132356 4817 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132426 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132512 4817 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132568 4817 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132623 4817 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132678 4817 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132729 4817 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132805 4817 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132864 4817 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132921 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.132972 4817 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133047 4817 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133100 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133172 4817 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133224 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node 
\"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133277 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133329 4817 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133386 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133449 4817 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133511 4817 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133574 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133635 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133694 4817 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133746 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133808 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133860 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133918 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133969 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134038 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134093 4817 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134152 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134205 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134263 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134313 4817 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134381 4817 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134444 4817 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134521 4817 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134586 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134643 4817 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134697 4817 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134749 4817 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134884 4817 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.134945 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135040 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135094 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135148 4817 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135212 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135269 4817 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135379 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135449 4817 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135501 4817 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 
13:59:18.135553 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135604 4817 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135660 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135722 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135774 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135831 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135887 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.135961 4817 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136043 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136108 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136164 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136232 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136288 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136358 4817 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136420 4817 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath 
\"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136504 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136597 4817 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136669 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136738 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136807 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136868 4817 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136929 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.136998 4817 
reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137060 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137126 4817 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137181 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137232 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137284 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137340 4817 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137395 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133389 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137452 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137571 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137590 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137605 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137617 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.137630 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.133487 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.138767 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.138743 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.138807 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139045 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139061 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139073 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139089 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139101 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139112 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139124 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139140 4817 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139155 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139169 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139185 4817 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139199 4817 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139213 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139227 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139242 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" 
DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139257 4817 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139271 4817 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139286 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139311 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139325 4817 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139340 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139354 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 
13:59:18.139369 4817 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139381 4817 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139393 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139405 4817 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139416 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139427 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139439 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139451 4817 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139461 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139472 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139483 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139494 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139505 4817 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139514 4817 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139524 4817 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" 
Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139537 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139547 4817 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139558 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139571 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139584 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139594 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139604 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139614 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" 
(UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139625 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139634 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139645 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139655 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139666 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139676 4817 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139687 4817 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139697 4817 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139708 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139718 4817 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139729 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139741 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139752 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139764 4817 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139778 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139799 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139810 4817 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139819 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139829 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139840 4817 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139850 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139860 4817 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139872 4817 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139881 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139891 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139902 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139911 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139921 4817 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") 
on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139933 4817 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139943 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139954 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139966 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.139997 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140009 4817 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140020 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140031 4817 
reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140041 4817 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140051 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140060 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140071 4817 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140081 4817 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140092 4817 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140105 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140115 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140126 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140136 4817 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140149 4817 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140164 4817 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140177 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140190 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140202 4817 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140213 4817 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140225 4817 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140235 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140246 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140257 4817 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.140267 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.146839 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.147879 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.149621 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.152382 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.165002 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.172765 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.172826 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.172917 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.173042 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.174168 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.177627 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.178136 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.179382 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.180034 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.181029 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.181556 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.182154 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.183101 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.183641 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.183713 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.184682 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.185187 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.186291 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.186771 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.187299 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.188189 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.188693 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.189671 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.190048 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.190583 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.191498 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.191941 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.192951 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.193381 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.194506 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.194547 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.194897 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.195518 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.196603 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.198652 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.199426 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.200683 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.201304 4817 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.201416 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.203464 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.204822 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.205348 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.207760 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.208134 4817 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.209088 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.209902 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.211966 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.212753 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.213890 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.214579 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.214931 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.215726 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.216631 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.217543 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.218385 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.219446 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.220367 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.221323 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.221797 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.223171 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.223400 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.229080 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.229648 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.230629 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.231785 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.235833 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.241924 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.241963 4817 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.241973 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.243361 4817 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c676f0-4dbd-472a-8ee1-31adc0c27dd3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea321d8e7b36278725596935e697ed5a16ba2c976f519db63e438907d80dcce\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T13:59:12Z\\\",\\\"message\\\":\\\"W0218 13:59:01.491479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0218 13:59:01.491844 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771423141 cert, and key in /tmp/serving-cert-3017209942/serving-signer.crt, /tmp/serving-cert-3017209942/serving-signer.key\\\\nI0218 13:59:01.712422 1 observer_polling.go:159] Starting file observer\\\\nW0218 13:59:01.715200 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0218 13:59:01.715414 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 13:59:01.717182 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3017209942/tls.crt::/tmp/serving-cert-3017209942/tls.key\\\\\\\"\\\\nF0218 13:59:12.020707 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T13:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.263918 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69c3a96a-f881-4310-92b3-293b5f4bfbd2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://144e1e125bf6f0fc575ac2ac61f6bb8313e28b0f893bde76ae7753f09f75efa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f536b0d62076bf44c14eb23cf13fe5bc79c67153d27a1dca9df0577aa9c9f036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://915da8bb0a6add297b39ff7e59c7bf717e2672d5ffd6101d9d5eaf49304b9e20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a62d15097d948d2c6b3d02119e678b234c4d2aacfde6af88a5d03f93e7bad9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.274826 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.288261 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.305494 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.316611 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.317442 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"59dc40fd2ffeaf149b40661d4471745a9727cd482d64dc4c55d1ca03c9d88049"} Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.319586 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"936169eafd37eb66829bdf6e87304d682f2369d0afb16228852d5b16b1ec6576"} Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.321934 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3e88192ad8b8c6e7cbab6124dfdd6a7215a605e6b562d4f1be3fdef74b625108"} Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.331047 4817 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.331326 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.355393 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c676f0-4dbd-472a-8ee1-31adc0c27dd3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea321d8e7b36278725596935e697ed5a16ba2c976f519db63e438907d80dcce\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T13:59:12Z\\\",\\\"message\\\":\\\"W0218 13:59:01.491479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0218 13:59:01.491844 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771423141 cert, and key in /tmp/serving-cert-3017209942/serving-signer.crt, /tmp/serving-cert-3017209942/serving-signer.key\\\\nI0218 13:59:01.712422 1 observer_polling.go:159] Starting file observer\\\\nW0218 13:59:01.715200 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0218 13:59:01.715414 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 13:59:01.717182 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3017209942/tls.crt::/tmp/serving-cert-3017209942/tls.key\\\\\\\"\\\\nF0218 13:59:12.020707 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T13:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.376631 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69c3a96a-f881-4310-92b3-293b5f4bfbd2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://144e1e125bf6f0fc575ac2ac61f6bb8313e28b0f893bde76ae7753f09f75efa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f536b0d62076bf44c14eb23cf13fe5bc79c67153d27a1dca9df0577aa9c9f036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://915da8bb0a6add297b39ff7e59c7bf717e2672d5ffd6101d9d5eaf49304b9e20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a62d15097d948d2c6b3d02119e678b234c4d2aacfde6af88a5d03f93e7bad9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.410004 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.647321 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.647434 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") 
pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.647528 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:19.647490647 +0000 UTC m=+22.223026630 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.647605 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.647620 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.647635 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 
13:59:18.647654 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.647653 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.647695 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.647722 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:19.647698632 +0000 UTC m=+22.223234615 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.647778 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.647831 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:19.647821505 +0000 UTC m=+22.223357698 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.647866 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.647869 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.647890 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.648023 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:19.647972429 +0000 UTC m=+22.223508412 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.648024 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:18 crc kubenswrapper[4817]: E0218 13:59:18.648084 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:19.648072252 +0000 UTC m=+22.223608235 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.967560 4817 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-18 13:54:17 +0000 UTC, rotation deadline is 2027-01-04 03:58:34.27271291 +0000 UTC Feb 18 13:59:18 crc kubenswrapper[4817]: I0218 13:59:18.967639 4817 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7669h59m15.30507762s for next certificate rotation Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.108716 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 05:24:42.228822452 +0000 UTC Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.171029 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.171201 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.327561 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6aa1481d21edba8135fdf3d83262ee26886f2ecbfa0252fd955041b33ae387c2"} Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.327611 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fbb5902d057f8c6f37250e91d1b57e5cc8658bac2d5779f149d84fa0536c143b"} Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.329241 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"86a4c02908fa2416da618c7e2e085861e9a8d200efdb0a47a6fff70121d20ae2"} Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.353381 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c676f0-4dbd-472a-8ee1-31adc0c27dd3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea321d8e7b36278725596935e697ed5a16ba2c976f519db63e438907d80dcce\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T13:59:12Z\\\",\\\"message\\\":\\\"W0218 13:59:01.491479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0218 13:59:01.491844 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771423141 cert, and key in /tmp/serving-cert-3017209942/serving-signer.crt, 
/tmp/serving-cert-3017209942/serving-signer.key\\\\nI0218 13:59:01.712422 1 observer_polling.go:159] Starting file observer\\\\nW0218 13:59:01.715200 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0218 13:59:01.715414 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 13:59:01.717182 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3017209942/tls.crt::/tmp/serving-cert-3017209942/tls.key\\\\\\\"\\\\nF0218 13:59:12.020707 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T13:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.368473 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69c3a96a-f881-4310-92b3-293b5f4bfbd2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://144e1e125bf6f0fc575ac2ac61f6bb8313e28b0f893bde76ae7753f09f75efa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f536b0d62076bf44c14eb23cf13fe5bc79c67153d27a1dca9df0577aa9c9f036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://915da8bb0a6add297b39ff7e59c7bf717e2672d5ffd6101d9d5eaf49304b9e20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a62d15097d948d2c6b3d02119e678b234c4d2aacfde6af88a5d03f93e7bad9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.386009 4817 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.400504 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.426435 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.447132 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.474996 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aa1481d21edba8135fdf3d83262ee26886f2ecbfa0252fd955041b33ae387c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb5902d057f8c6f37250e91d1b57e5cc8658bac2d5779f149d84fa0536c143b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.507437 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.529102 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a4c02908fa2416da618c7e2e085861e9a8d200efdb0a47a6fff70121d20ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.543507 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aa1481d21edba8135fdf3d83262ee26886f2ecbfa0252fd955041b33ae387c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fbb5902d057f8c6f37250e91d1b57e5cc8658bac2d5779f149d84fa0536c143b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.561480 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.573897 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.600314 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c676f0-4dbd-472a-8ee1-31adc0c27dd3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea321d8e7b36278725596935e697ed5a16ba2c976f519db63e438907d80dcce\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T13:59:12Z\\\",\\\"message\\\":\\\"W0218 13:59:01.491479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0218 13:59:01.491844 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771423141 cert, and key in /tmp/serving-cert-3017209942/serving-signer.crt, /tmp/serving-cert-3017209942/serving-signer.key\\\\nI0218 13:59:01.712422 1 observer_polling.go:159] Starting file observer\\\\nW0218 13:59:01.715200 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0218 13:59:01.715414 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 13:59:01.717182 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3017209942/tls.crt::/tmp/serving-cert-3017209942/tls.key\\\\\\\"\\\\nF0218 13:59:12.020707 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T13:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.616931 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69c3a96a-f881-4310-92b3-293b5f4bfbd2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://144e1e125bf6f0fc575ac2ac61f6bb8313e28b0f893bde76ae7753f09f75efa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f536b0d62076bf44c14eb23cf13fe5bc79c67153d27a1dca9df0577aa9c9f036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://915da8bb0a6add297b39ff7e59c7bf717e2672d5ffd6101d9d5eaf49304b9e20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a62d15097d948d2c6b3d02119e678b234c4d2aacfde6af88a5d03f93e7bad9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.642169 4817 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.657612 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.657717 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.657775 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.657871 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.657914 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:21.657863848 +0000 UTC m=+24.233399831 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.658004 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658138 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658169 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658181 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658269 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-18 13:59:21.658257869 +0000 UTC m=+24.233794032 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658191 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658300 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658328 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:21.65832096 +0000 UTC m=+24.233856943 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658335 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658358 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658379 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658416 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:21.658406152 +0000 UTC m=+24.233942135 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 13:59:19 crc kubenswrapper[4817]: E0218 13:59:19.658435 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:21.658424783 +0000 UTC m=+24.233960766 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.660892 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.824846 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qhbvx"] Feb 18 13:59:19 crc kubenswrapper[4817]: 
I0218 13:59:19.825667 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.829245 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.830605 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.830690 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.832689 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.850519 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aa1481d21edba8135fdf3d83262ee26886f2ecbfa0252fd955041b33ae387c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb5902d057f8c6f37250e91d1b57e5cc8658bac2d5779f149d84fa0536c143b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.860400 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5693edce-06cb-4180-b00d-6f3c63da3457-serviceca\") pod \"node-ca-qhbvx\" (UID: \"5693edce-06cb-4180-b00d-6f3c63da3457\") " pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.860446 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbfb9\" (UniqueName: \"kubernetes.io/projected/5693edce-06cb-4180-b00d-6f3c63da3457-kube-api-access-qbfb9\") pod \"node-ca-qhbvx\" (UID: \"5693edce-06cb-4180-b00d-6f3c63da3457\") " pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.860512 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5693edce-06cb-4180-b00d-6f3c63da3457-host\") pod \"node-ca-qhbvx\" (UID: \"5693edce-06cb-4180-b00d-6f3c63da3457\") " pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.869613 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.888419 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhbvx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5693edce-06cb-4180-b00d-6f3c63da3457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbfb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:59:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhbvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.905169 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a4c02908fa2416da618c7e2e085861e9a8d200efdb0a47a6fff70121d20ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.919020 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69c3a96a-f881-4310-92b3-293b5f4bfbd2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://144e1e125bf6f0fc575ac2ac61f6bb8313e28b0f893bde76ae7753f09f75efa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f536b0d62076bf44c14eb23cf13fe5bc79c67153d27a1dca9df0577aa9c9f036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://915da8bb0a6add297b39ff7e59c7bf717e2672d5ffd6101d9d5eaf49304b9e20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a62d15097d948d2c6b3d02119e678b234c4d2aacfde6af88a5d03f93e7bad9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.936776 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.954344 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.961950 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5693edce-06cb-4180-b00d-6f3c63da3457-serviceca\") pod \"node-ca-qhbvx\" (UID: \"5693edce-06cb-4180-b00d-6f3c63da3457\") " pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.962110 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbfb9\" (UniqueName: \"kubernetes.io/projected/5693edce-06cb-4180-b00d-6f3c63da3457-kube-api-access-qbfb9\") pod \"node-ca-qhbvx\" (UID: \"5693edce-06cb-4180-b00d-6f3c63da3457\") " pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.962226 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/5693edce-06cb-4180-b00d-6f3c63da3457-host\") pod \"node-ca-qhbvx\" (UID: \"5693edce-06cb-4180-b00d-6f3c63da3457\") " pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.962313 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5693edce-06cb-4180-b00d-6f3c63da3457-host\") pod \"node-ca-qhbvx\" (UID: \"5693edce-06cb-4180-b00d-6f3c63da3457\") " pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.963131 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5693edce-06cb-4180-b00d-6f3c63da3457-serviceca\") pod \"node-ca-qhbvx\" (UID: \"5693edce-06cb-4180-b00d-6f3c63da3457\") " pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.966843 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 13:59:19.983147 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbfb9\" (UniqueName: \"kubernetes.io/projected/5693edce-06cb-4180-b00d-6f3c63da3457-kube-api-access-qbfb9\") pod \"node-ca-qhbvx\" (UID: \"5693edce-06cb-4180-b00d-6f3c63da3457\") " pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:19 crc kubenswrapper[4817]: I0218 
13:59:19.985182 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c676f0-4dbd-472a-8ee1-31adc0c27dd3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea321d8e7b36278725596935e697ed5a16ba2c976f519db63e438907d80dcce\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T13:59:12Z\\\",\\\"message\\\":\\\"W0218 13:59:01.491479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0218 13:59:01.491844 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771423141 cert, and key in /tmp/serving-cert-3017209942/serving-signer.crt, /tmp/serving-cert-3017209942/serving-signer.key\\\\nI0218 13:59:01.712422 1 observer_polling.go:159] Starting file observer\\\\nW0218 13:59:01.715200 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0218 13:59:01.715414 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 13:59:01.717182 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3017209942/tls.crt::/tmp/serving-cert-3017209942/tls.key\\\\\\\"\\\\nF0218 13:59:12.020707 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T13:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:19Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.109465 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 02:04:46.457396878 +0000 UTC Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.138499 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qhbvx" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.152178 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5693edce_06cb_4180_b00d_6f3c63da3457.slice/crio-b9fb2676e2a59b072ee9de8419841e746a34aadcaa824856f383ceafd4179362 WatchSource:0}: Error finding container b9fb2676e2a59b072ee9de8419841e746a34aadcaa824856f383ceafd4179362: Status 404 returned error can't find the container with id b9fb2676e2a59b072ee9de8419841e746a34aadcaa824856f383ceafd4179362 Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.171690 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.171740 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.171879 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.171993 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.211855 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5h4bp"] Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.212305 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-g6zzb"] Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.212530 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5h4bp" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.212926 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.216933 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.217067 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.217225 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.217735 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.218015 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.218155 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.218328 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.220693 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.231024 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qhbvx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5693edce-06cb-4180-b00d-6f3c63da3457\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbfb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:59:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qhbvx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:20Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.253383 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86a4c02908fa2416da618c7e2e085861e9a8d200efdb0a47a6fff70121d20ae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:20Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.266117 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aa1481d21edba8135fdf3d83262ee26886f2ecbfa0252fd955041b33ae387c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://fbb5902d057f8c6f37250e91d1b57e5cc8658bac2d5779f149d84fa0536c143b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:20Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.269340 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rn6q\" (UniqueName: \"kubernetes.io/projected/cb572c3b-50c1-4c26-8e38-214c61889f96-kube-api-access-4rn6q\") pod \"node-resolver-5h4bp\" (UID: \"cb572c3b-50c1-4c26-8e38-214c61889f96\") " pod="openshift-dns/node-resolver-5h4bp" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.269385 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/cb572c3b-50c1-4c26-8e38-214c61889f96-hosts-file\") pod \"node-resolver-5h4bp\" (UID: \"cb572c3b-50c1-4c26-8e38-214c61889f96\") " pod="openshift-dns/node-resolver-5h4bp" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.269412 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-mcd-auth-proxy-config\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.269438 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-rootfs\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.269457 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-proxy-tls\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.269489 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnwqj\" (UniqueName: \"kubernetes.io/projected/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-kube-api-access-dnwqj\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.282453 4817 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:20Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.295367 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:20Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.310139 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:20Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.328178 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:20Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.334021 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qhbvx" 
event={"ID":"5693edce-06cb-4180-b00d-6f3c63da3457","Type":"ContainerStarted","Data":"b9fb2676e2a59b072ee9de8419841e746a34aadcaa824856f383ceafd4179362"} Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.341851 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5h4bp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb572c3b-50c1-4c26-8e38-214c61889f96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rn6q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:59:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5h4bp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:20Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.356922 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c676f0-4dbd-472a-8ee1-31adc0c27dd3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eea321d8e7b36278725596935e697ed5a16ba2c976f519db63e438907d80dcce\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-18T13:59:12Z\\\",\\\"message\\\":\\\"W0218 13:59:01.491479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0218 13:59:01.491844 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771423141 cert, and key in /tmp/serving-cert-3017209942/serving-signer.crt, /tmp/serving-cert-3017209942/serving-signer.key\\\\nI0218 13:59:01.712422 1 observer_polling.go:159] Starting file observer\\\\nW0218 13:59:01.715200 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0218 13:59:01.715414 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0218 13:59:01.717182 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3017209942/tls.crt::/tmp/serving-cert-3017209942/tls.key\\\\\\\"\\\\nF0218 13:59:12.020707 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-18T13:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:20Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.370162 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cb572c3b-50c1-4c26-8e38-214c61889f96-hosts-file\") pod \"node-resolver-5h4bp\" (UID: \"cb572c3b-50c1-4c26-8e38-214c61889f96\") " pod="openshift-dns/node-resolver-5h4bp" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.370254 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-mcd-auth-proxy-config\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.370375 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cb572c3b-50c1-4c26-8e38-214c61889f96-hosts-file\") pod \"node-resolver-5h4bp\" (UID: \"cb572c3b-50c1-4c26-8e38-214c61889f96\") " pod="openshift-dns/node-resolver-5h4bp" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.370656 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rootfs\" (UniqueName: \"kubernetes.io/host-path/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-rootfs\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.370722 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-rootfs\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.370751 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-proxy-tls\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.370847 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnwqj\" (UniqueName: \"kubernetes.io/projected/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-kube-api-access-dnwqj\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.370909 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn6q\" (UniqueName: \"kubernetes.io/projected/cb572c3b-50c1-4c26-8e38-214c61889f96-kube-api-access-4rn6q\") pod \"node-resolver-5h4bp\" (UID: \"cb572c3b-50c1-4c26-8e38-214c61889f96\") " pod="openshift-dns/node-resolver-5h4bp" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.371236 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-mcd-auth-proxy-config\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.375512 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-proxy-tls\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.379886 4817 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69c3a96a-f881-4310-92b3-293b5f4bfbd2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:59:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-18T13:58:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://144e1e125bf6f0fc575ac2ac61f6bb8313e28b0f893bde76ae7753f09f75efa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f536b0d62076bf44c14eb23cf13fe5bc79c67153d27a1dca9df0577aa9c9f036\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://915da8bb0a6add297b39ff7e59c7bf717e2672d5ffd6101d9d5eaf49304b9e20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-02-18T13:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a62d15097d948d2c6b3d02119e678b234c4d2aacfde6af88a5d03f93e7bad9a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-18T13:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-18T13:58:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-18T13:59:20Z is after 2025-08-24T17:21:41Z" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.394102 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnwqj\" (UniqueName: 
\"kubernetes.io/projected/b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5-kube-api-access-dnwqj\") pod \"machine-config-daemon-g6zzb\" (UID: \"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5\") " pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.396141 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rn6q\" (UniqueName: \"kubernetes.io/projected/cb572c3b-50c1-4c26-8e38-214c61889f96-kube-api-access-4rn6q\") pod \"node-resolver-5h4bp\" (UID: \"cb572c3b-50c1-4c26-8e38-214c61889f96\") " pod="openshift-dns/node-resolver-5h4bp" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.493950 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=2.49391895 podStartE2EDuration="2.49391895s" podCreationTimestamp="2026-02-18 13:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:20.493691804 +0000 UTC m=+23.069227787" watchObservedRunningTime="2026-02-18 13:59:20.49391895 +0000 UTC m=+23.069454933" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.512523 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=2.512501601 podStartE2EDuration="2.512501601s" podCreationTimestamp="2026-02-18 13:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:20.511377872 +0000 UTC m=+23.086913855" watchObservedRunningTime="2026-02-18 13:59:20.512501601 +0000 UTC m=+23.088037584" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.531787 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5h4bp" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.541625 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.544431 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb572c3b_50c1_4c26_8e38_214c61889f96.slice/crio-5eb979dcc9c4e9a2f0d00345cb8681fa7cf3eb84b65583057f0bb6a6aab4dc0f WatchSource:0}: Error finding container 5eb979dcc9c4e9a2f0d00345cb8681fa7cf3eb84b65583057f0bb6a6aab4dc0f: Status 404 returned error can't find the container with id 5eb979dcc9c4e9a2f0d00345cb8681fa7cf3eb84b65583057f0bb6a6aab4dc0f Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.559722 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5c599bf_d7b3_4cd0_9fe8_e31ac79192a5.slice/crio-05d41d369d695d81559ba9bf2a312396240b13a2d5dc0fb3a65070f7d4afb080 WatchSource:0}: Error finding container 05d41d369d695d81559ba9bf2a312396240b13a2d5dc0fb3a65070f7d4afb080: Status 404 returned error can't find the container with id 05d41d369d695d81559ba9bf2a312396240b13a2d5dc0fb3a65070f7d4afb080 Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.604017 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zh96d"] Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.604756 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.606349 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xkbz6"] Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.606536 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.607352 4817 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.607382 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.607434 4817 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": failed to list *v1.ConfigMap: configmaps "ovnkube-script-lib" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.607446 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-script-lib\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.607912 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-pvcsq"] Feb 18 13:59:20 crc 
kubenswrapper[4817]: W0218 13:59:20.608446 4817 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.608470 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.608513 4817 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": failed to list *v1.Secret: secrets "ovn-node-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.608525 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-node-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.608591 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.608908 4817 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-config": failed to list *v1.ConfigMap: configmaps "ovnkube-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.608925 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.608967 4817 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.608995 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.609023 4817 reflector.go:561] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: 
configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.609032 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.609816 4817 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.609839 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.609894 4817 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this 
object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.609907 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.609930 4817 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.609939 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.609963 4817 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list *v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.609991 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User 
\"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.610105 4817 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.610118 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.610143 4817 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.610152 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: W0218 13:59:20.610184 4817 reflector.go:561] 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 18 13:59:20 crc kubenswrapper[4817]: E0218 13:59:20.610192 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.673870 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-run-k8s-cni-cncf-io\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.673913 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-daemon-config\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.673933 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-cnibin\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " 
pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.673951 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-etc-kubernetes\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674113 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-ovn\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674181 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-node-log\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674236 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-system-cni-dir\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674281 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-cni-dir\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 
13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674298 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-kubelet\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674322 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-openvswitch\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674343 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-log-socket\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674364 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-script-lib\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674380 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfz4f\" (UniqueName: \"kubernetes.io/projected/539032ff-0878-4adb-88ba-770644cf6912-kube-api-access-cfz4f\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " 
pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674430 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-cnibin\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674449 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-hostroot\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674474 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-var-lib-kubelet\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674495 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-conf-dir\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674511 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/539032ff-0878-4adb-88ba-770644cf6912-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " 
pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674530 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-systemd-units\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674568 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-var-lib-cni-bin\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674587 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-env-overrides\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674608 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh8c5\" (UniqueName: \"kubernetes.io/projected/04978aec-7cd4-435f-a1d9-d3e0223c0e75-kube-api-access-gh8c5\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674628 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674644 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovn-node-metrics-cert\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674675 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-bin\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674693 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-system-cni-dir\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674738 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-os-release\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674757 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-netns\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674773 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-config\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674804 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-socket-dir-parent\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674834 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdcw\" (UniqueName: \"kubernetes.io/projected/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-kube-api-access-npdcw\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674847 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-os-release\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674870 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/539032ff-0878-4adb-88ba-770644cf6912-cni-binary-copy\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674902 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-systemd\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674921 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-var-lib-openvswitch\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674955 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-run-netns\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.674972 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.675028 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-cni-binary-copy\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.675043 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-var-lib-cni-multus\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.675057 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-run-multus-certs\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.675079 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-netd\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.675097 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-slash\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.675118 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-etc-openvswitch\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.675133 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776382 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-cnibin\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776437 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-hostroot\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776467 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-var-lib-kubelet\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776492 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-conf-dir\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776516 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/539032ff-0878-4adb-88ba-770644cf6912-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776542 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-systemd-units\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776577 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-var-lib-cni-bin\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776596 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-env-overrides\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776616 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8c5\" (UniqueName: 
\"kubernetes.io/projected/04978aec-7cd4-435f-a1d9-d3e0223c0e75-kube-api-access-gh8c5\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776640 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776664 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovn-node-metrics-cert\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776683 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-bin\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776701 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-system-cni-dir\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776722 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-os-release\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776744 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-netns\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776762 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-config\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776783 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-socket-dir-parent\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776803 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npdcw\" (UniqueName: \"kubernetes.io/projected/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-kube-api-access-npdcw\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776825 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-os-release\") pod 
\"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776846 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/539032ff-0878-4adb-88ba-770644cf6912-cni-binary-copy\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776866 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-systemd\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776891 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-var-lib-openvswitch\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776922 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-run-netns\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776943 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pvcsq\" 
(UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.776973 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-cni-binary-copy\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777014 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-var-lib-cni-multus\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777035 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-run-multus-certs\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777057 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-netd\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777078 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-slash\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 
13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777098 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-etc-openvswitch\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777120 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777153 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-run-k8s-cni-cncf-io\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777175 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-daemon-config\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777198 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-cnibin\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777221 
4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-etc-kubernetes\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777247 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-ovn\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777268 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-node-log\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777290 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-system-cni-dir\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777319 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-cni-dir\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777339 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-kubelet\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777360 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-openvswitch\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777378 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-log-socket\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777397 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-script-lib\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777418 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfz4f\" (UniqueName: \"kubernetes.io/projected/539032ff-0878-4adb-88ba-770644cf6912-kube-api-access-cfz4f\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777750 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-cnibin\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777795 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-hostroot\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777820 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-var-lib-kubelet\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777850 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-conf-dir\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.777952 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-var-lib-cni-multus\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778001 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-run-multus-certs\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 
18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778032 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-netd\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778061 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-slash\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778092 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-etc-openvswitch\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778123 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778152 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-run-k8s-cni-cncf-io\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778209 4817 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-cnibin\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778241 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-etc-kubernetes\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778307 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-ovn\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778291 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-systemd-units\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778333 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-var-lib-cni-bin\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778353 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-node-log\") pod 
\"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778421 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-system-cni-dir\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778575 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-kubelet\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778611 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-cni-dir\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778648 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-openvswitch\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778665 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 
13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778758 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778831 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-systemd\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778831 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-socket-dir-parent\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778700 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-log-socket\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778864 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-var-lib-openvswitch\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778874 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-bin\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778894 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-host-run-netns\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778915 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-system-cni-dir\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.778950 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-netns\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.779233 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04978aec-7cd4-435f-a1d9-d3e0223c0e75-os-release\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6" Feb 18 13:59:20 crc kubenswrapper[4817]: I0218 13:59:20.779303 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/539032ff-0878-4adb-88ba-770644cf6912-os-release\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.067210 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4"] Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.067813 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.069920 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.070774 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.110232 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 18:59:43.481340704 +0000 UTC Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.117103 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pj24h"] Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.117624 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.117701 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj24h" podUID="29f1a30b-47d4-452e-9017-dcc9cf78795f" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.171333 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.171516 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.181071 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c405b132-ac6b-43e7-9937-393e89a189b5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.181144 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.181173 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c405b132-ac6b-43e7-9937-393e89a189b5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: 
\"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.181201 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49zjd\" (UniqueName: \"kubernetes.io/projected/29f1a30b-47d4-452e-9017-dcc9cf78795f-kube-api-access-49zjd\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.181259 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpmww\" (UniqueName: \"kubernetes.io/projected/c405b132-ac6b-43e7-9937-393e89a189b5-kube-api-access-zpmww\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.181277 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c405b132-ac6b-43e7-9937-393e89a189b5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.282434 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c405b132-ac6b-43e7-9937-393e89a189b5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.282512 4817 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.282550 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49zjd\" (UniqueName: \"kubernetes.io/projected/29f1a30b-47d4-452e-9017-dcc9cf78795f-kube-api-access-49zjd\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.282576 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c405b132-ac6b-43e7-9937-393e89a189b5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.282644 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpmww\" (UniqueName: \"kubernetes.io/projected/c405b132-ac6b-43e7-9937-393e89a189b5-kube-api-access-zpmww\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.282672 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c405b132-ac6b-43e7-9937-393e89a189b5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" Feb 18 13:59:21 
crc kubenswrapper[4817]: E0218 13:59:21.282834 4817 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.283028 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs podName:29f1a30b-47d4-452e-9017-dcc9cf78795f nodeName:}" failed. No retries permitted until 2026-02-18 13:59:21.782933253 +0000 UTC m=+24.358469236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs") pod "network-metrics-daemon-pj24h" (UID: "29f1a30b-47d4-452e-9017-dcc9cf78795f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.289245 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c405b132-ac6b-43e7-9937-393e89a189b5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.337817 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qhbvx" event={"ID":"5693edce-06cb-4180-b00d-6f3c63da3457","Type":"ContainerStarted","Data":"2c4d7380b01d261a2562a3f8e23674040061ba73057f1193b71dcff23ecc59f8"} Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.339663 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d8b8bdb643a7817951ad46e83f459287f88d5a32bae65e097572e6e8c0cb7921"} Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 
13:59:21.341856 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"4263a220e5890fa40e9706fbcb656b210eb46157f1857a07eda5fc6e064d541f"}
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.341882 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"c7c1c799a80a9d975ab53d4cf5272008822680f6f55efd7a2e6bec382bbea671"}
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.341892 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"05d41d369d695d81559ba9bf2a312396240b13a2d5dc0fb3a65070f7d4afb080"}
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.343706 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5h4bp" event={"ID":"cb572c3b-50c1-4c26-8e38-214c61889f96","Type":"ContainerStarted","Data":"30bcdea8e491d37a3b640b65fa6736fad45c0db576c999a18ca71f2866711921"}
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.343792 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5h4bp" event={"ID":"cb572c3b-50c1-4c26-8e38-214c61889f96","Type":"ContainerStarted","Data":"5eb979dcc9c4e9a2f0d00345cb8681fa7cf3eb84b65583057f0bb6a6aab4dc0f"}
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.353569 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qhbvx" podStartSLOduration=2.3535577610000002 podStartE2EDuration="2.353557761s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:21.35313337 +0000 UTC m=+23.928669353" watchObservedRunningTime="2026-02-18 13:59:21.353557761 +0000 UTC m=+23.929093744"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.382021 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5h4bp" podStartSLOduration=2.382000007 podStartE2EDuration="2.382000007s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:21.365817768 +0000 UTC m=+23.941353751" watchObservedRunningTime="2026-02-18 13:59:21.382000007 +0000 UTC m=+23.957535990"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.382539 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podStartSLOduration=2.38253512 podStartE2EDuration="2.38253512s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:21.381447832 +0000 UTC m=+23.956983825" watchObservedRunningTime="2026-02-18 13:59:21.38253512 +0000 UTC m=+23.958071103"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.420855 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.428791 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/539032ff-0878-4adb-88ba-770644cf6912-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.533337 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.539734 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.548519 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.553172 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.607498 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.628830 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.680119 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.687062 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.687209 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.687285 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:25.687246974 +0000 UTC m=+28.262782977 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.687328 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.687374 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.687393 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:25.687375698 +0000 UTC m=+28.262911681 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.687538 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.687594 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:25.687583953 +0000 UTC m=+28.263119936 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.687643 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.687791 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.687819 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.687833 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.687873 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:25.68786396 +0000 UTC m=+28.263399943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.687916 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.688048 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.688068 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.688080 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.688119 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:25.688108447 +0000 UTC m=+28.263644430 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.692115 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.693113 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovn-node-metrics-cert\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.778878 4817 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.778912 4817 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.778915 4817 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-script-lib: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.778874 4817 configmap.go:193] Couldn't get configMap openshift-multus/multus-daemon-config: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.778994 4817 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-config: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.778973 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/539032ff-0878-4adb-88ba-770644cf6912-cni-binary-copy podName:539032ff-0878-4adb-88ba-770644cf6912 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:22.278952087 +0000 UTC m=+24.854488070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/539032ff-0878-4adb-88ba-770644cf6912-cni-binary-copy") pod "multus-additional-cni-plugins-pvcsq" (UID: "539032ff-0878-4adb-88ba-770644cf6912") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.779038 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-cni-binary-copy podName:04978aec-7cd4-435f-a1d9-d3e0223c0e75 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:22.279025619 +0000 UTC m=+24.854561602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-cni-binary-copy") pod "multus-xkbz6" (UID: "04978aec-7cd4-435f-a1d9-d3e0223c0e75") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.779044 4817 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/env-overrides: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.779057 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-daemon-config podName:04978aec-7cd4-435f-a1d9-d3e0223c0e75 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:22.27904794 +0000 UTC m=+24.854583923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "multus-daemon-config" (UniqueName: "kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-daemon-config") pod "multus-xkbz6" (UID: "04978aec-7cd4-435f-a1d9-d3e0223c0e75") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.779191 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-script-lib podName:1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:22.279155902 +0000 UTC m=+24.854691905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovnkube-script-lib" (UniqueName: "kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-script-lib") pod "ovnkube-node-zh96d" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.779222 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-config podName:1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:22.279210974 +0000 UTC m=+24.854746967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovnkube-config" (UniqueName: "kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-config") pod "ovnkube-node-zh96d" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.779239 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-env-overrides podName:1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:22.279231764 +0000 UTC m=+24.854767757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "env-overrides" (UniqueName: "kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-env-overrides") pod "ovnkube-node-zh96d" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.789368 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " pod="openshift-multus/network-metrics-daemon-pj24h"
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.789611 4817 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.789673 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs podName:29f1a30b-47d4-452e-9017-dcc9cf78795f nodeName:}" failed. No retries permitted until 2026-02-18 13:59:22.789658864 +0000 UTC m=+25.365194837 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs") pod "network-metrics-daemon-pj24h" (UID: "29f1a30b-47d4-452e-9017-dcc9cf78795f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.795946 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.805288 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.806332 4817 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.806427 4817 projected.go:194] Error preparing data for projected volume kube-api-access-gh8c5 for pod openshift-multus/multus-xkbz6: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.806545 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04978aec-7cd4-435f-a1d9-d3e0223c0e75-kube-api-access-gh8c5 podName:04978aec-7cd4-435f-a1d9-d3e0223c0e75 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:22.30650907 +0000 UTC m=+24.882045083 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gh8c5" (UniqueName: "kubernetes.io/projected/04978aec-7cd4-435f-a1d9-d3e0223c0e75-kube-api-access-gh8c5") pod "multus-xkbz6" (UID: "04978aec-7cd4-435f-a1d9-d3e0223c0e75") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.819171 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.821062 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.823766 4817 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.823817 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cfz4f for pod openshift-multus/multus-additional-cni-plugins-pvcsq: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: E0218 13:59:21.823880 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/539032ff-0878-4adb-88ba-770644cf6912-kube-api-access-cfz4f podName:539032ff-0878-4adb-88ba-770644cf6912 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:22.323859879 +0000 UTC m=+24.899395852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cfz4f" (UniqueName: "kubernetes.io/projected/539032ff-0878-4adb-88ba-770644cf6912-kube-api-access-cfz4f") pod "multus-additional-cni-plugins-pvcsq" (UID: "539032ff-0878-4adb-88ba-770644cf6912") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.824520 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c405b132-ac6b-43e7-9937-393e89a189b5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.847970 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.877751 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 18 13:59:21 crc kubenswrapper[4817]: I0218 13:59:21.886440 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49zjd\" (UniqueName: \"kubernetes.io/projected/29f1a30b-47d4-452e-9017-dcc9cf78795f-kube-api-access-49zjd\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " pod="openshift-multus/network-metrics-daemon-pj24h"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.039475 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.047705 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpmww\" (UniqueName: \"kubernetes.io/projected/c405b132-ac6b-43e7-9937-393e89a189b5-kube-api-access-zpmww\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.050795 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npdcw\" (UniqueName: \"kubernetes.io/projected/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-kube-api-access-npdcw\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.053668 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.064050 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c405b132-ac6b-43e7-9937-393e89a189b5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rx2q4\" (UID: \"c405b132-ac6b-43e7-9937-393e89a189b5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.110838 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 10:19:23.345914929 +0000 UTC
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.173661 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 13:59:22 crc kubenswrapper[4817]: E0218 13:59:22.174243 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.174713 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 13:59:22 crc kubenswrapper[4817]: E0218 13:59:22.174787 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.296909 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-daemon-config\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.296962 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-script-lib\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.297022 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-env-overrides\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.297073 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-config\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.297103 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/539032ff-0878-4adb-88ba-770644cf6912-cni-binary-copy\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.297133 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-cni-binary-copy\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.297942 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-cni-binary-copy\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.298759 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-script-lib\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.299154 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-env-overrides\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.299572 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-config\") pod \"ovnkube-node-zh96d\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.299575 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04978aec-7cd4-435f-a1d9-d3e0223c0e75-multus-daemon-config\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.299572 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/539032ff-0878-4adb-88ba-770644cf6912-cni-binary-copy\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.307715 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4"
Feb 18 13:59:22 crc kubenswrapper[4817]: W0218 13:59:22.325054 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc405b132_ac6b_43e7_9937_393e89a189b5.slice/crio-f23c2fc8129b0e0c56cbd22380a5d14e468cd7063701cf5e5d82928bdcaaf7b7 WatchSource:0}: Error finding container f23c2fc8129b0e0c56cbd22380a5d14e468cd7063701cf5e5d82928bdcaaf7b7: Status 404 returned error can't find the container with id f23c2fc8129b0e0c56cbd22380a5d14e468cd7063701cf5e5d82928bdcaaf7b7
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.349167 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" event={"ID":"c405b132-ac6b-43e7-9937-393e89a189b5","Type":"ContainerStarted","Data":"f23c2fc8129b0e0c56cbd22380a5d14e468cd7063701cf5e5d82928bdcaaf7b7"}
Feb 18 13:59:22 crc kubenswrapper[4817]: E0218 13:59:22.362687 4817 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.398625 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8c5\" (UniqueName: \"kubernetes.io/projected/04978aec-7cd4-435f-a1d9-d3e0223c0e75-kube-api-access-gh8c5\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.398766 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfz4f\" (UniqueName: \"kubernetes.io/projected/539032ff-0878-4adb-88ba-770644cf6912-kube-api-access-cfz4f\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.404044 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfz4f\" (UniqueName: \"kubernetes.io/projected/539032ff-0878-4adb-88ba-770644cf6912-kube-api-access-cfz4f\") pod \"multus-additional-cni-plugins-pvcsq\" (UID: \"539032ff-0878-4adb-88ba-770644cf6912\") " pod="openshift-multus/multus-additional-cni-plugins-pvcsq"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.404312 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh8c5\" (UniqueName: \"kubernetes.io/projected/04978aec-7cd4-435f-a1d9-d3e0223c0e75-kube-api-access-gh8c5\") pod \"multus-xkbz6\" (UID: \"04978aec-7cd4-435f-a1d9-d3e0223c0e75\") " pod="openshift-multus/multus-xkbz6"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.423863 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.430748 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xkbz6"
Feb 18 13:59:22 crc kubenswrapper[4817]: W0218 13:59:22.449063 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d3841ed_d9b2_4e7a_9eb3_6650fa0be74b.slice/crio-6959bc59ccef4b1b32e4b2c520bee0db960072623fbbc83680c6ae2e722c80d9 WatchSource:0}: Error finding container 6959bc59ccef4b1b32e4b2c520bee0db960072623fbbc83680c6ae2e722c80d9: Status 404 returned error can't find the container with id 6959bc59ccef4b1b32e4b2c520bee0db960072623fbbc83680c6ae2e722c80d9
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.452087 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pvcsq"
Feb 18 13:59:22 crc kubenswrapper[4817]: W0218 13:59:22.460168 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04978aec_7cd4_435f_a1d9_d3e0223c0e75.slice/crio-0af47b646a829a5063d530befebbe37c4c4a8a684166aa2992eea40bd3e8baba WatchSource:0}: Error finding container 0af47b646a829a5063d530befebbe37c4c4a8a684166aa2992eea40bd3e8baba: Status 404 returned error can't find the container with id 0af47b646a829a5063d530befebbe37c4c4a8a684166aa2992eea40bd3e8baba
Feb 18 13:59:22 crc kubenswrapper[4817]: I0218 13:59:22.802732 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " pod="openshift-multus/network-metrics-daemon-pj24h"
Feb 18 13:59:22 crc kubenswrapper[4817]: E0218 13:59:22.803017 4817 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 18 13:59:22 crc kubenswrapper[4817]: E0218 13:59:22.803290 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs podName:29f1a30b-47d4-452e-9017-dcc9cf78795f nodeName:}" failed. No retries permitted until 2026-02-18 13:59:24.80327131 +0000 UTC m=+27.378807293 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs") pod "network-metrics-daemon-pj24h" (UID: "29f1a30b-47d4-452e-9017-dcc9cf78795f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.111370 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 14:31:26.455664013 +0000 UTC Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.170778 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.170859 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:23 crc kubenswrapper[4817]: E0218 13:59:23.171014 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj24h" podUID="29f1a30b-47d4-452e-9017-dcc9cf78795f" Feb 18 13:59:23 crc kubenswrapper[4817]: E0218 13:59:23.171118 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.355080 4817 generic.go:334] "Generic (PLEG): container finished" podID="539032ff-0878-4adb-88ba-770644cf6912" containerID="c8a9667ebe4aa4543e79a9240b700f8f7e278807a806e27fe026e617bf79a74a" exitCode=0 Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.355163 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvcsq" event={"ID":"539032ff-0878-4adb-88ba-770644cf6912","Type":"ContainerDied","Data":"c8a9667ebe4aa4543e79a9240b700f8f7e278807a806e27fe026e617bf79a74a"} Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.355208 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvcsq" event={"ID":"539032ff-0878-4adb-88ba-770644cf6912","Type":"ContainerStarted","Data":"d400ff4b633ae69d8aca2becf1bf69765e7c15e9dad54b9f18c851d3387bd73c"} Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.357425 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xkbz6" event={"ID":"04978aec-7cd4-435f-a1d9-d3e0223c0e75","Type":"ContainerStarted","Data":"f6781392caddb60590b6d79fa957427f81b0bbf46661c2ea8b56564e7b4bac1b"} Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.357488 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xkbz6" event={"ID":"04978aec-7cd4-435f-a1d9-d3e0223c0e75","Type":"ContainerStarted","Data":"0af47b646a829a5063d530befebbe37c4c4a8a684166aa2992eea40bd3e8baba"} Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.360720 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" event={"ID":"c405b132-ac6b-43e7-9937-393e89a189b5","Type":"ContainerStarted","Data":"f76cdd9a279046be18fdcab7075d12096545ee3379c92004b7c8d4e2281e6702"} Feb 18 13:59:23 crc 
kubenswrapper[4817]: I0218 13:59:23.360772 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" event={"ID":"c405b132-ac6b-43e7-9937-393e89a189b5","Type":"ContainerStarted","Data":"744ace07b0cde8e45a3573b3e7a50d84dfc73c85d44c4f12a593f685081ac3fa"} Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.363754 4817 generic.go:334] "Generic (PLEG): container finished" podID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerID="87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95" exitCode=0 Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.364935 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerDied","Data":"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"} Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.365028 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerStarted","Data":"6959bc59ccef4b1b32e4b2c520bee0db960072623fbbc83680c6ae2e722c80d9"} Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.383081 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=2.383050311 podStartE2EDuration="2.383050311s" podCreationTimestamp="2026-02-18 13:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:23.382576688 +0000 UTC m=+25.958112671" watchObservedRunningTime="2026-02-18 13:59:23.383050311 +0000 UTC m=+25.958586334" Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.486730 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rx2q4" 
podStartSLOduration=3.486705863 podStartE2EDuration="3.486705863s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:23.485021339 +0000 UTC m=+26.060557322" watchObservedRunningTime="2026-02-18 13:59:23.486705863 +0000 UTC m=+26.062241846" Feb 18 13:59:23 crc kubenswrapper[4817]: I0218 13:59:23.506898 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xkbz6" podStartSLOduration=4.506869734 podStartE2EDuration="4.506869734s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:23.506383822 +0000 UTC m=+26.081919805" watchObservedRunningTime="2026-02-18 13:59:23.506869734 +0000 UTC m=+26.082405717" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.112060 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 06:59:41.24296987 +0000 UTC Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.171576 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.171576 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:24 crc kubenswrapper[4817]: E0218 13:59:24.171741 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 13:59:24 crc kubenswrapper[4817]: E0218 13:59:24.171783 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.324036 4817 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.326617 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.326649 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.326658 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.326794 4817 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.334577 4817 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.334941 4817 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.336400 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.336460 4817 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.336476 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.336500 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.336521 4817 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-18T13:59:24Z","lastTransitionTime":"2026-02-18T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.369722 4817 generic.go:334] "Generic (PLEG): container finished" podID="539032ff-0878-4adb-88ba-770644cf6912" containerID="b17fd04a943106c841bcf810e3d8852a7201667e452d661252f8c1aec91dd380" exitCode=0 Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.369793 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvcsq" event={"ID":"539032ff-0878-4adb-88ba-770644cf6912","Type":"ContainerDied","Data":"b17fd04a943106c841bcf810e3d8852a7201667e452d661252f8c1aec91dd380"} Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.374449 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerStarted","Data":"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"} Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.374492 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" 
event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerStarted","Data":"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"} Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.374512 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerStarted","Data":"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"} Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.374523 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerStarted","Data":"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"} Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.374534 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerStarted","Data":"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"} Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.374545 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerStarted","Data":"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"} Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.384218 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r"] Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.384551 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.386416 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.387199 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.387439 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.390362 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.420457 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/607dbd3a-6068-4b82-a331-628574003738-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.420508 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/607dbd3a-6068-4b82-a331-628574003738-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.420572 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/607dbd3a-6068-4b82-a331-628574003738-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.420606 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607dbd3a-6068-4b82-a331-628574003738-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.420629 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/607dbd3a-6068-4b82-a331-628574003738-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.523365 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/607dbd3a-6068-4b82-a331-628574003738-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.523455 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607dbd3a-6068-4b82-a331-628574003738-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" 
Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.523499 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/607dbd3a-6068-4b82-a331-628574003738-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.523583 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/607dbd3a-6068-4b82-a331-628574003738-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.523602 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/607dbd3a-6068-4b82-a331-628574003738-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.523692 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/607dbd3a-6068-4b82-a331-628574003738-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.523743 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/607dbd3a-6068-4b82-a331-628574003738-etc-ssl-certs\") pod 
\"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.525159 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/607dbd3a-6068-4b82-a331-628574003738-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.533520 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607dbd3a-6068-4b82-a331-628574003738-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.544254 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/607dbd3a-6068-4b82-a331-628574003738-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gdq9r\" (UID: \"607dbd3a-6068-4b82-a331-628574003738\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.702971 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" Feb 18 13:59:24 crc kubenswrapper[4817]: W0218 13:59:24.718479 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607dbd3a_6068_4b82_a331_628574003738.slice/crio-9c67cb04e9ca3cc0f7da7ad8400c0a3370d724357164db003524f99d05eb6123 WatchSource:0}: Error finding container 9c67cb04e9ca3cc0f7da7ad8400c0a3370d724357164db003524f99d05eb6123: Status 404 returned error can't find the container with id 9c67cb04e9ca3cc0f7da7ad8400c0a3370d724357164db003524f99d05eb6123 Feb 18 13:59:24 crc kubenswrapper[4817]: I0218 13:59:24.826941 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:24 crc kubenswrapper[4817]: E0218 13:59:24.827183 4817 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 13:59:24 crc kubenswrapper[4817]: E0218 13:59:24.827287 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs podName:29f1a30b-47d4-452e-9017-dcc9cf78795f nodeName:}" failed. No retries permitted until 2026-02-18 13:59:28.827253046 +0000 UTC m=+31.402789129 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs") pod "network-metrics-daemon-pj24h" (UID: "29f1a30b-47d4-452e-9017-dcc9cf78795f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.113238 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 08:52:17.68353796 +0000 UTC Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.113373 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.124805 4817 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.171324 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.171381 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.171462 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj24h" podUID="29f1a30b-47d4-452e-9017-dcc9cf78795f" Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.171546 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.383741 4817 generic.go:334] "Generic (PLEG): container finished" podID="539032ff-0878-4adb-88ba-770644cf6912" containerID="cfae210ba25228cdaf8cef20422e9f72cd8b7d6c4a0941355d56d52afe06a9d9" exitCode=0 Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.384496 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvcsq" event={"ID":"539032ff-0878-4adb-88ba-770644cf6912","Type":"ContainerDied","Data":"cfae210ba25228cdaf8cef20422e9f72cd8b7d6c4a0941355d56d52afe06a9d9"} Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.387216 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" event={"ID":"607dbd3a-6068-4b82-a331-628574003738","Type":"ContainerStarted","Data":"6475ac54d14a3033edbaadb69a56275d8475bcee8c3431f5ae0ff48469ad1c1f"} Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.387258 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" event={"ID":"607dbd3a-6068-4b82-a331-628574003738","Type":"ContainerStarted","Data":"9c67cb04e9ca3cc0f7da7ad8400c0a3370d724357164db003524f99d05eb6123"} Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.426107 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gdq9r" podStartSLOduration=6.42608636 podStartE2EDuration="6.42608636s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:25.425488074 +0000 UTC m=+28.001024067" watchObservedRunningTime="2026-02-18 13:59:25.42608636 +0000 UTC m=+28.001622363" Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.737793 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.738013 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.738076 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738130 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 13:59:33.738096973 +0000 UTC m=+36.313632946 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738209 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.738224 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738294 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:33.738268577 +0000 UTC m=+36.313804600 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738319 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 13:59:25 crc kubenswrapper[4817]: I0218 13:59:25.738357 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738374 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738391 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738402 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738449 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:33.738440361 +0000 UTC m=+36.313976344 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738499 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:33.738464692 +0000 UTC m=+36.314000715 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738519 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738546 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738565 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:25 crc kubenswrapper[4817]: E0218 13:59:25.738623 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:33.738608766 +0000 UTC m=+36.314144789 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:26 crc kubenswrapper[4817]: I0218 13:59:26.170938 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:26 crc kubenswrapper[4817]: E0218 13:59:26.171333 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 13:59:26 crc kubenswrapper[4817]: I0218 13:59:26.170947 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:26 crc kubenswrapper[4817]: E0218 13:59:26.171954 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 13:59:26 crc kubenswrapper[4817]: I0218 13:59:26.395292 4817 generic.go:334] "Generic (PLEG): container finished" podID="539032ff-0878-4adb-88ba-770644cf6912" containerID="57258d9348aa2e0d7949061061d42b853a47b3d5daebb3948bd977cb0dbb3f40" exitCode=0 Feb 18 13:59:26 crc kubenswrapper[4817]: I0218 13:59:26.395348 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvcsq" event={"ID":"539032ff-0878-4adb-88ba-770644cf6912","Type":"ContainerDied","Data":"57258d9348aa2e0d7949061061d42b853a47b3d5daebb3948bd977cb0dbb3f40"} Feb 18 13:59:26 crc kubenswrapper[4817]: I0218 13:59:26.404511 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerStarted","Data":"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"} Feb 18 13:59:27 crc kubenswrapper[4817]: I0218 13:59:27.170644 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:27 crc kubenswrapper[4817]: I0218 13:59:27.170677 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:27 crc kubenswrapper[4817]: E0218 13:59:27.170883 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 13:59:27 crc kubenswrapper[4817]: E0218 13:59:27.171030 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj24h" podUID="29f1a30b-47d4-452e-9017-dcc9cf78795f" Feb 18 13:59:27 crc kubenswrapper[4817]: I0218 13:59:27.414142 4817 generic.go:334] "Generic (PLEG): container finished" podID="539032ff-0878-4adb-88ba-770644cf6912" containerID="b6c7014f4712b85955079bf0b8957d225304f6397c0bbbca9aed40690d044a34" exitCode=0 Feb 18 13:59:27 crc kubenswrapper[4817]: I0218 13:59:27.414273 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvcsq" event={"ID":"539032ff-0878-4adb-88ba-770644cf6912","Type":"ContainerDied","Data":"b6c7014f4712b85955079bf0b8957d225304f6397c0bbbca9aed40690d044a34"} Feb 18 13:59:27 crc kubenswrapper[4817]: I0218 13:59:27.945909 4817 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 18 13:59:28 crc kubenswrapper[4817]: I0218 13:59:28.170745 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:28 crc kubenswrapper[4817]: I0218 13:59:28.170839 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:28 crc kubenswrapper[4817]: E0218 13:59:28.171604 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 13:59:28 crc kubenswrapper[4817]: E0218 13:59:28.171758 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 13:59:28 crc kubenswrapper[4817]: I0218 13:59:28.422740 4817 generic.go:334] "Generic (PLEG): container finished" podID="539032ff-0878-4adb-88ba-770644cf6912" containerID="42410dc64ed7d5252265dd827631cc1d959ce30a18be7942365bc4d56f3f7cab" exitCode=0 Feb 18 13:59:28 crc kubenswrapper[4817]: I0218 13:59:28.422802 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvcsq" event={"ID":"539032ff-0878-4adb-88ba-770644cf6912","Type":"ContainerDied","Data":"42410dc64ed7d5252265dd827631cc1d959ce30a18be7942365bc4d56f3f7cab"} Feb 18 13:59:28 crc kubenswrapper[4817]: I0218 13:59:28.875135 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " 
pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:28 crc kubenswrapper[4817]: E0218 13:59:28.875499 4817 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 13:59:28 crc kubenswrapper[4817]: E0218 13:59:28.875704 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs podName:29f1a30b-47d4-452e-9017-dcc9cf78795f nodeName:}" failed. No retries permitted until 2026-02-18 13:59:36.875658311 +0000 UTC m=+39.451194364 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs") pod "network-metrics-daemon-pj24h" (UID: "29f1a30b-47d4-452e-9017-dcc9cf78795f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.171401 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.171544 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:29 crc kubenswrapper[4817]: E0218 13:59:29.171594 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 13:59:29 crc kubenswrapper[4817]: E0218 13:59:29.171751 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj24h" podUID="29f1a30b-47d4-452e-9017-dcc9cf78795f" Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.316010 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.431924 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvcsq" event={"ID":"539032ff-0878-4adb-88ba-770644cf6912","Type":"ContainerStarted","Data":"1440fce9e6ba5bf854062d192498747a00c390bad7041c741b691fc01fd0c143"} Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.438742 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerStarted","Data":"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"} Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.439259 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.439298 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.439311 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 
13:59:29.463427 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pvcsq" podStartSLOduration=10.463390917 podStartE2EDuration="10.463390917s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:29.460714878 +0000 UTC m=+32.036250871" watchObservedRunningTime="2026-02-18 13:59:29.463390917 +0000 UTC m=+32.038926920" Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.473000 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.474654 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" Feb 18 13:59:29 crc kubenswrapper[4817]: I0218 13:59:29.490545 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" podStartSLOduration=9.490517319 podStartE2EDuration="9.490517319s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:29.489811811 +0000 UTC m=+32.065347804" watchObservedRunningTime="2026-02-18 13:59:29.490517319 +0000 UTC m=+32.066053322" Feb 18 13:59:30 crc kubenswrapper[4817]: I0218 13:59:30.171411 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:30 crc kubenswrapper[4817]: E0218 13:59:30.171586 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 13:59:30 crc kubenswrapper[4817]: I0218 13:59:30.172452 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:30 crc kubenswrapper[4817]: E0218 13:59:30.172674 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 13:59:30 crc kubenswrapper[4817]: I0218 13:59:30.748567 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pj24h"] Feb 18 13:59:30 crc kubenswrapper[4817]: I0218 13:59:30.748752 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:30 crc kubenswrapper[4817]: E0218 13:59:30.748892 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj24h" podUID="29f1a30b-47d4-452e-9017-dcc9cf78795f" Feb 18 13:59:31 crc kubenswrapper[4817]: I0218 13:59:31.170675 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:31 crc kubenswrapper[4817]: E0218 13:59:31.170818 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 13:59:32 crc kubenswrapper[4817]: I0218 13:59:32.171243 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:32 crc kubenswrapper[4817]: I0218 13:59:32.171371 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:32 crc kubenswrapper[4817]: E0218 13:59:32.171469 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj24h" podUID="29f1a30b-47d4-452e-9017-dcc9cf78795f" Feb 18 13:59:32 crc kubenswrapper[4817]: E0218 13:59:32.171600 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 18 13:59:32 crc kubenswrapper[4817]: I0218 13:59:32.171782 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:32 crc kubenswrapper[4817]: E0218 13:59:32.171891 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.170640 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.171336 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.827971 4817 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.828300 4817 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.833283 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.833383 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.833421 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.833474 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 13:59:49.833432694 +0000 UTC m=+52.408968717 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.833545 4817 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.833559 4817 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.833581 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.833744 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.833794 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 13:59:33 crc 
kubenswrapper[4817]: E0218 13:59:33.833822 4817 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.833595 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:49.833583338 +0000 UTC m=+52.409119331 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.833924 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:49.833899307 +0000 UTC m=+52.409435370 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.833968 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.834032 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.834049 4817 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.834061 4817 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.834083 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:49.834061861 +0000 UTC m=+52.409597884 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.834133 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:49.834117142 +0000 UTC m=+52.409653165 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.887595 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4rsdh"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.888412 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.889359 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-v2snv"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.890069 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.890941 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wwvbf"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.891467 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wwvbf"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.893494 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l7vpp"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.894786 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.894946 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.897061 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.898612 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.899456 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.899609 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.900503 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.900938 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bclz6"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.901454 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.906556 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.908584 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.925123 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.925548 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.927271 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.928243 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rpmhp"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.928296 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.928321 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-error": failed to list *v1.Secret: secrets "v4-0-config-user-template-error" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.928364 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-error\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.928430 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template": failed to list *v1.Secret: secrets "v4-0-config-system-ocp-branding-template" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.928447 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-ocp-branding-template\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.928526 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.928710 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.928762 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.928781 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.928918 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-serving-cert": failed to list *v1.Secret: secrets "v4-0-config-system-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.928945 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.929087 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-router-certs": failed to list *v1.Secret: secrets "v4-0-config-system-router-certs" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929106 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-router-certs\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.929168 4817 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.929184 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.929248 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-login": failed to list *v1.Secret: secrets "v4-0-config-user-template-login" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.929262 4817 reflector.go:561] object-"openshift-oauth-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.928716 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data": failed to list *v1.Secret: secrets "v4-0-config-user-idp-0-file-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929291 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929264 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-login\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929299 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-idp-0-file-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.929327 4817 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929360 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.929382 4817 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.929420 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.929453 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.929518 4817 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929533 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.929581 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.929802 4817 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929818 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.929864 4817 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929875 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.929937 4817 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929948 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.929951 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.930086 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.930126 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.930210 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929183 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930444 4817 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930472 4817 reflector.go:561] object-"openshift-oauth-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930479 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930486 4817 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930497 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930509 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930580 4817 reflector.go:561] object-"openshift-authentication"/"audit": failed to list *v1.ConfigMap: configmaps "audit" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930597 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"audit\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930583 4817 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930624 4817 reflector.go:561] object-"openshift-oauth-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930631 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.930614 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930645 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930691 4817 reflector.go:561] object-"openshift-oauth-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930699 4817 reflector.go:561] object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq": failed to list *v1.Secret: secrets "oauth-apiserver-sa-dockercfg-6r2bq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930705 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930720 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-6r2bq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-apiserver-sa-dockercfg-6r2bq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930759 4817 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930772 4817 reflector.go:561] object-"openshift-oauth-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930776 4817 reflector.go:561] object-"openshift-authentication"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930787 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930771 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930796 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.929444 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930815 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-session": failed to list *v1.Secret: secrets "v4-0-config-system-session" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930827 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-session\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930846 4817 reflector.go:561] object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc": failed to list *v1.Secret: secrets "oauth-openshift-dockercfg-znhcc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930848 4817 reflector.go:561] object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930861 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-znhcc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-openshift-dockercfg-znhcc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930866 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930900 4817 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.930911 4817 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930916 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.930926 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.931495 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.931670 4817 reflector.go:561] object-"openshift-oauth-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.931705 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.931764 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-provider-selection": failed to list *v1.Secret: secrets "v4-0-config-user-template-provider-selection" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.931780 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-provider-selection\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.931935 4817 reflector.go:561] object-"openshift-route-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.931989 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.932084 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.932696 4817 reflector.go:561] object-"openshift-authentication"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.932750 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.934523 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.935665 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g"]
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.935968 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.936313 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g"
Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.936722 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"]
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.936897 4817 reflector.go:561] object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2": failed to list *v1.Secret: secrets "route-controller-manager-sa-dockercfg-h2zr2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.936942 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-cliconfig": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-cliconfig" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.937024 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-cliconfig\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.936934 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-h2zr2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"route-controller-manager-sa-dockercfg-h2zr2\" is forbidden: User 
\"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.937148 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.937498 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc" Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.939882 4817 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.939924 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.940076 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.940217 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-service-ca": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-service-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between 
node 'crc' and this object Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.940245 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-service-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.940297 4817 reflector.go:561] object-"openshift-route-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.940314 4817 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:33 crc kubenswrapper[4817]: W0218 13:59:33.942698 4817 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Feb 18 13:59:33 crc kubenswrapper[4817]: E0218 13:59:33.942729 4817 reflector.go:158] 
"Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.945313 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l9ndv"] Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.946297 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g"] Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.946715 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bwc6d"] Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.947358 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.949949 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.950369 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.953667 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.953891 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.957515 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.957904 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.958506 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.958780 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.958942 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.959176 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.959922 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.960329 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 18 13:59:33 crc 
kubenswrapper[4817]: I0218 13:59:33.960445 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.960453 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6"] Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.960900 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.961066 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j4llw"] Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.961202 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.961352 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"] Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.973189 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.975116 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.975420 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.975840 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.975911 4817 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.976903 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.978085 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.978782 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.979277 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.979993 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.981089 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.983797 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.984294 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.984644 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.984739 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.984782 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.984650 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.985045 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.985740 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.986511 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.986723 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 
13:59:33.986843 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 18 13:59:33 crc kubenswrapper[4817]: I0218 13:59:33.989257 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-975fj"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.009293 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.010197 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.011406 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.011544 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.011944 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.012150 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.012543 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mx7fv"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.013369 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.013460 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vs5x4"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.013680 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.014575 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.014956 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.018271 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.018746 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.018802 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.020856 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.020942 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.021746 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.021910 4817 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.022000 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.022767 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.023016 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.028371 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.029164 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.029248 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.032701 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.034059 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.034865 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.036795 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-oauth-serving-cert\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.036895 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2e62f6f-bd21-4255-85df-0c31e9edadaf-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.037276 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a0286241-5427-4d37-9c39-717227ba63d8-machine-approver-tls\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.037406 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbh4k\" (UniqueName: \"kubernetes.io/projected/149bcfc3-9623-403e-8c4c-1019bd5f0c16-kube-api-access-qbh4k\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.037105 4817 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.038343 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.037509 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b5551de-6f87-439b-875c-b66e902f2f25-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.038695 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d96mv\" (UniqueName: \"kubernetes.io/projected/18a3347a-5dd7-4047-8c43-9c073c9321e6-kube-api-access-d96mv\") pod \"marketplace-operator-79b997595-j4llw\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.038737 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vbnz\" (UniqueName: \"kubernetes.io/projected/9f128b3f-8527-4b4b-86d5-b456fe89c804-kube-api-access-9vbnz\") pod \"openshift-config-operator-7777fb866f-dqctb\" (UID: \"9f128b3f-8527-4b4b-86d5-b456fe89c804\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.038768 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.038794 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f80446-5f8d-476a-91e7-c42b9572854b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ds5z6\" (UID: \"25f80446-5f8d-476a-91e7-c42b9572854b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.038821 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-etcd-client\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.038847 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.038874 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxmdb\" (UniqueName: \"kubernetes.io/projected/43f553ef-0150-4383-8c39-5db2cbcab63d-kube-api-access-kxmdb\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.038744 4817 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.038965 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039022 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58vgz\" (UniqueName: \"kubernetes.io/projected/3707018b-031a-4902-8e5c-ba5bc46cc4c4-kube-api-access-58vgz\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039045 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dthtv\" (UniqueName: \"kubernetes.io/projected/f8d657cd-dbc9-45d6-8ac4-9c22d9709980-kube-api-access-dthtv\") pod \"package-server-manager-789f6589d5-j4s9g\" (UID: \"f8d657cd-dbc9-45d6-8ac4-9c22d9709980\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039075 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db6bfaa-89b3-49e5-9c33-2959670f96f1-config\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039107 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039137 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-config\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039160 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-config\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039205 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039230 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgcch\" (UniqueName: \"kubernetes.io/projected/3db6bfaa-89b3-49e5-9c33-2959670f96f1-kube-api-access-fgcch\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039252 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-images\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039291 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f128b3f-8527-4b4b-86d5-b456fe89c804-serving-cert\") pod \"openshift-config-operator-7777fb866f-dqctb\" (UID: \"9f128b3f-8527-4b4b-86d5-b456fe89c804\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039337 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0286241-5427-4d37-9c39-717227ba63d8-auth-proxy-config\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039362 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9k4j\" (UniqueName: \"kubernetes.io/projected/a0286241-5427-4d37-9c39-717227ba63d8-kube-api-access-p9k4j\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039389 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-serving-cert\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039411 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd99710-b175-4115-8944-1fac544145c5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039434 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc05dc28-13af-4d05-835a-e3ecc993b1ab-trusted-ca\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039464 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039489 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw69f\" (UniqueName: \"kubernetes.io/projected/25f80446-5f8d-476a-91e7-c42b9572854b-kube-api-access-pw69f\") pod \"openshift-controller-manager-operator-756b6f6bc6-ds5z6\" (UID: \"25f80446-5f8d-476a-91e7-c42b9572854b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039524 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039532 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039547 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039811 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-config\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039838 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-audit-dir\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039872 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039908 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039927 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039950 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b5551de-6f87-439b-875c-b66e902f2f25-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.039972 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78ff2eb3-3f40-4529-b52e-62316f24fd15-metrics-tls\") pod \"dns-operator-744455d44c-l9ndv\" (UID: \"78ff2eb3-3f40-4529-b52e-62316f24fd15\") " pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040010 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d682539-5f81-4b10-a11a-50d3a9ef0c1c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gj9xr\" (UID: \"9d682539-5f81-4b10-a11a-50d3a9ef0c1c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040033 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b5551de-6f87-439b-875c-b66e902f2f25-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040056 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d657cd-dbc9-45d6-8ac4-9c22d9709980-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j4s9g\" (UID: \"f8d657cd-dbc9-45d6-8ac4-9c22d9709980\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040082 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-trusted-ca-bundle\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040099 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040124 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtns2\" (UniqueName: \"kubernetes.io/projected/39a56faf-6fea-45d0-9531-fb86f571fd8b-kube-api-access-gtns2\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040141 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-serving-cert\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040165 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-oauth-config\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040185 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9ncz\" (UniqueName: \"kubernetes.io/projected/78ff2eb3-3f40-4529-b52e-62316f24fd15-kube-api-access-f9ncz\") pod \"dns-operator-744455d44c-l9ndv\" (UID: \"78ff2eb3-3f40-4529-b52e-62316f24fd15\") " pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040210 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-image-import-ca\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040242 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040278 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040302 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-dir\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040320 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040345 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db6bfaa-89b3-49e5-9c33-2959670f96f1-serving-cert\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.040366 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-policies\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.041258 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.041961 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.042754 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mddfd"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.043570 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.058160 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.059397 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.059690 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.058196 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knjlz\" (UniqueName: \"kubernetes.io/projected/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-kube-api-access-knjlz\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.060127 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.060520 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-node-pullsecrets\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.062031 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.062494 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.062631 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-encryption-config\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.063066 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfgz7\" (UniqueName: \"kubernetes.io/projected/cc05dc28-13af-4d05-835a-e3ecc993b1ab-kube-api-access-jfgz7\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.064281 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkb2f\" (UniqueName: \"kubernetes.io/projected/5345e1d1-a74f-4d8f-8b86-2bb389a525a2-kube-api-access-nkb2f\") pod \"downloads-7954f5f757-wwvbf\" (UID: \"5345e1d1-a74f-4d8f-8b86-2bb389a525a2\") " pod="openshift-console/downloads-7954f5f757-wwvbf"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.064382 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qd62\" (UniqueName: \"kubernetes.io/projected/6b5551de-6f87-439b-875c-b66e902f2f25-kube-api-access-8qd62\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.064546 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-client\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.064626 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kjls\" (UniqueName: \"kubernetes.io/projected/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-kube-api-access-8kjls\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.064715 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc05dc28-13af-4d05-835a-e3ecc993b1ab-config\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.064798 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-serving-cert\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.064875 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-service-ca\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.064993 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3db6bfaa-89b3-49e5-9c33-2959670f96f1-etcd-client\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065077 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065166 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065242 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j4llw\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j4llw"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065319 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0286241-5427-4d37-9c39-717227ba63d8-config\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065393 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-audit\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065508 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065589 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs8n9\" (UniqueName: \"kubernetes.io/projected/3bd99710-b175-4115-8944-1fac544145c5-kube-api-access-zs8n9\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065672 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065751 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065833 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2e62f6f-bd21-4255-85df-0c31e9edadaf-metrics-tls\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.065915 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3db6bfaa-89b3-49e5-9c33-2959670f96f1-etcd-ca\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.066022 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3db6bfaa-89b3-49e5-9c33-2959670f96f1-etcd-service-ca\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.066444 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlpp7\" (UniqueName: \"kubernetes.io/projected/c2e62f6f-bd21-4255-85df-0c31e9edadaf-kube-api-access-xlpp7\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.066587 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25f80446-5f8d-476a-91e7-c42b9572854b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ds5z6\" (UID: \"25f80446-5f8d-476a-91e7-c42b9572854b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.067547 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxqt7\" (UniqueName: \"kubernetes.io/projected/9d682539-5f81-4b10-a11a-50d3a9ef0c1c-kube-api-access-sxqt7\") pod \"cluster-samples-operator-665b6dd947-gj9xr\" (UID: \"9d682539-5f81-4b10-a11a-50d3a9ef0c1c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.067661 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j4llw\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j4llw"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.067740 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-dir\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.067837 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.067949 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.068069 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2e62f6f-bd21-4255-85df-0c31e9edadaf-trusted-ca\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.068157 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.068248 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9f128b3f-8527-4b4b-86d5-b456fe89c804-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dqctb\" (UID: \"9f128b3f-8527-4b4b-86d5-b456fe89c804\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.068327 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-encryption-config\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.068490 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc05dc28-13af-4d05-835a-e3ecc993b1ab-serving-cert\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.069061 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.069384 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.070209 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.076584 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.089862 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6w9rz"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.092193 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.093396 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.093801 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6w9rz"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.109191 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.110058 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.110948 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.111208 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4rsdh"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.113043 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.114487 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.115290 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bclz6"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.116397 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l7vpp"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.117358 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.118271 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v2snv"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.120815 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.120946 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j4llw"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.122022 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pqtc6"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.122951 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pqtc6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.123096 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x4qx6"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.124239 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.125367 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.126045 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x4qx6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.126480 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.128337 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wwvbf"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.129177 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mx7fv"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.129866 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-975fj"]
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.130875 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.131380 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.133547 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bwc6d"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.134756 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l9ndv"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.136360 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.139122 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.140225 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.141400 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.143060 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rpmhp"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.143739 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-87jjf"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.144620 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-87jjf" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.144841 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.145822 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.147224 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.148207 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vs5x4"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.149204 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-pb2jx"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.150076 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.150223 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.150830 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.151476 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mddfd"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.152472 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.153394 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.154338 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pqtc6"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.155386 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.157111 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.160484 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m47g9"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.163259 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.163309 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.163433 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m47g9" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.169606 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j4llw\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.169678 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0286241-5427-4d37-9c39-717227ba63d8-config\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.169756 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.169820 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs8n9\" (UniqueName: 
\"kubernetes.io/projected/3bd99710-b175-4115-8944-1fac544145c5-kube-api-access-zs8n9\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.169864 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mdfh\" (UniqueName: \"kubernetes.io/projected/98bc0caa-ee8a-406e-952d-b400d8c72116-kube-api-access-4mdfh\") pod \"openshift-apiserver-operator-796bbdcf4f-twxwz\" (UID: \"98bc0caa-ee8a-406e-952d-b400d8c72116\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.169917 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.169952 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2e62f6f-bd21-4255-85df-0c31e9edadaf-metrics-tls\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170006 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3db6bfaa-89b3-49e5-9c33-2959670f96f1-etcd-service-ca\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:34 
crc kubenswrapper[4817]: I0218 13:59:34.170112 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlpp7\" (UniqueName: \"kubernetes.io/projected/c2e62f6f-bd21-4255-85df-0c31e9edadaf-kube-api-access-xlpp7\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170197 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170274 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0f3e1cca-80c4-4d17-bb9f-9556fed78aac-srv-cert\") pod \"catalog-operator-68c6474976-vqv79\" (UID: \"0f3e1cca-80c4-4d17-bb9f-9556fed78aac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170347 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2e62f6f-bd21-4255-85df-0c31e9edadaf-trusted-ca\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170377 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170428 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlr8v\" (UniqueName: \"kubernetes.io/projected/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-kube-api-access-xlr8v\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170464 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0286241-5427-4d37-9c39-717227ba63d8-config\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170586 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-encryption-config\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170648 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98bc0caa-ee8a-406e-952d-b400d8c72116-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-twxwz\" (UID: \"98bc0caa-ee8a-406e-952d-b400d8c72116\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170684 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-metrics-certs\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170726 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-oauth-serving-cert\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170760 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a0286241-5427-4d37-9c39-717227ba63d8-machine-approver-tls\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170797 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f85543c-dccf-4a3e-be40-7305a2e49d1d-secret-volume\") pod \"collect-profiles-29523705-gm644\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171152 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3db6bfaa-89b3-49e5-9c33-2959670f96f1-etcd-service-ca\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171341 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6smt\" (UniqueName: \"kubernetes.io/projected/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-kube-api-access-p6smt\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171390 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed392029-9b45-4464-9422-10e4ec72db07-images\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171432 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b9edf7b-9549-4eda-a45e-27b94c137b4a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9mrg6\" (UID: \"5b9edf7b-9549-4eda-a45e-27b94c137b4a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171470 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgkpz\" (UniqueName: \"kubernetes.io/projected/9f66536d-c481-41b3-b5e5-8259651a95d9-kube-api-access-cgkpz\") pod \"control-plane-machine-set-operator-78cbb6b69f-mj6tz\" (UID: \"9f66536d-c481-41b3-b5e5-8259651a95d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171538 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6b5551de-6f87-439b-875c-b66e902f2f25-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171574 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-etcd-serving-ca\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171608 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f80446-5f8d-476a-91e7-c42b9572854b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ds5z6\" (UID: \"25f80446-5f8d-476a-91e7-c42b9572854b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171781 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171842 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.171902 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dthtv\" (UniqueName: 
\"kubernetes.io/projected/f8d657cd-dbc9-45d6-8ac4-9c22d9709980-kube-api-access-dthtv\") pod \"package-server-manager-789f6589d5-j4s9g\" (UID: \"f8d657cd-dbc9-45d6-8ac4-9c22d9709980\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.172017 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db6bfaa-89b3-49e5-9c33-2959670f96f1-config\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.172063 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqftb\" (UniqueName: \"kubernetes.io/projected/456d086a-23a5-42fa-b637-a090f649ffe5-kube-api-access-wqftb\") pod \"machine-config-controller-84d6567774-mxgkh\" (UID: \"456d086a-23a5-42fa-b637-a090f649ffe5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.172360 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57387b14-66f5-4d48-9591-784eb4216a13-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z77sh\" (UID: \"57387b14-66f5-4d48-9591-784eb4216a13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.173011 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-config\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc 
kubenswrapper[4817]: I0218 13:59:34.173248 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-config\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.173520 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-etcd-serving-ca\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.173620 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.173722 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgcch\" (UniqueName: \"kubernetes.io/projected/3db6bfaa-89b3-49e5-9c33-2959670f96f1-kube-api-access-fgcch\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.173809 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-stats-auth\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " 
pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.173860 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9k4j\" (UniqueName: \"kubernetes.io/projected/a0286241-5427-4d37-9c39-717227ba63d8-kube-api-access-p9k4j\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.174174 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-serving-cert\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.174471 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc05dc28-13af-4d05-835a-e3ecc993b1ab-trusted-ca\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.174578 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25f80446-5f8d-476a-91e7-c42b9572854b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ds5z6\" (UID: \"25f80446-5f8d-476a-91e7-c42b9572854b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.174947 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3db6bfaa-89b3-49e5-9c33-2959670f96f1-config\") pod 
\"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.170648 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.175661 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.175818 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2e62f6f-bd21-4255-85df-0c31e9edadaf-trusted-ca\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.175926 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176001 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176306 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176360 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176409 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176439 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f85543c-dccf-4a3e-be40-7305a2e49d1d-config-volume\") pod \"collect-profiles-29523705-gm644\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176474 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176583 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-config\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176621 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176655 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-config\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176679 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176945 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.177007 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-service-ca-bundle\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.177036 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b5551de-6f87-439b-875c-b66e902f2f25-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.177073 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l7rd\" (UniqueName: \"kubernetes.io/projected/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-kube-api-access-7l7rd\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.176948 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj24h"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.177309 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98bc0caa-ee8a-406e-952d-b400d8c72116-config\") pod \"openshift-apiserver-operator-796bbdcf4f-twxwz\" (UID: \"98bc0caa-ee8a-406e-952d-b400d8c72116\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.177515 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f66536d-c481-41b3-b5e5-8259651a95d9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mj6tz\" (UID: \"9f66536d-c481-41b3-b5e5-8259651a95d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.177598 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-trusted-ca-bundle\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.177635 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-serving-cert\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.178515 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b5551de-6f87-439b-875c-b66e902f2f25-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.179367 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-oauth-serving-cert\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.179373 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-oauth-config\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.179722 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-trusted-ca-bundle\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.180158 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-config\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.180236 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9ncz\" (UniqueName: \"kubernetes.io/projected/78ff2eb3-3f40-4529-b52e-62316f24fd15-kube-api-access-f9ncz\") pod \"dns-operator-744455d44c-l9ndv\" (UID: \"78ff2eb3-3f40-4529-b52e-62316f24fd15\") " pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.180337 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0f3e1cca-80c4-4d17-bb9f-9556fed78aac-profile-collector-cert\") pod \"catalog-operator-68c6474976-vqv79\" (UID: \"0f3e1cca-80c4-4d17-bb9f-9556fed78aac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.180379 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2e62f6f-bd21-4255-85df-0c31e9edadaf-metrics-tls\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.180444 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acfbc5ef-21ea-41bb-ba10-c5b5e79dd593-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mx7fv\" (UID: \"acfbc5ef-21ea-41bb-ba10-c5b5e79dd593\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.180852 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx565\" (UniqueName: \"kubernetes.io/projected/3f85543c-dccf-4a3e-be40-7305a2e49d1d-kube-api-access-xx565\") pod \"collect-profiles-29523705-gm644\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.180970 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8-config\") pod \"service-ca-operator-777779d784-tg2g5\" (UID: \"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.181069 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.181159 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szfbg\" (UniqueName: \"kubernetes.io/projected/0f3e1cca-80c4-4d17-bb9f-9556fed78aac-kube-api-access-szfbg\") pod \"catalog-operator-68c6474976-vqv79\" (UID: \"0f3e1cca-80c4-4d17-bb9f-9556fed78aac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.181610 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-dir\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.181723 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knjlz\" (UniqueName: \"kubernetes.io/projected/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-kube-api-access-knjlz\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.181760 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-dir\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.181812 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-node-pullsecrets\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.181962 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-client\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.182066 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-node-pullsecrets\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.182211 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.182284 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed392029-9b45-4464-9422-10e4ec72db07-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.182507 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.182631 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-audit\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.182773 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-config\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.182914 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-config\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183075 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183111 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3db6bfaa-89b3-49e5-9c33-2959670f96f1-etcd-ca\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183148 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed392029-9b45-4464-9422-10e4ec72db07-proxy-tls\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183179 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-service-ca-bundle\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183203 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b9edf7b-9549-4eda-a45e-27b94c137b4a-srv-cert\") pod \"olm-operator-6b444d44fb-9mrg6\" (UID: \"5b9edf7b-9549-4eda-a45e-27b94c137b4a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183232 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25f80446-5f8d-476a-91e7-c42b9572854b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ds5z6\" (UID: \"25f80446-5f8d-476a-91e7-c42b9572854b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183260 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxqt7\" (UniqueName: \"kubernetes.io/projected/9d682539-5f81-4b10-a11a-50d3a9ef0c1c-kube-api-access-sxqt7\") pod \"cluster-samples-operator-665b6dd947-gj9xr\" (UID: \"9d682539-5f81-4b10-a11a-50d3a9ef0c1c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183295 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j4llw\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j4llw"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183324 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-dir\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183346 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skw9m\" (UniqueName: \"kubernetes.io/projected/acfbc5ef-21ea-41bb-ba10-c5b5e79dd593-kube-api-access-skw9m\") pod \"multus-admission-controller-857f4d67dd-mx7fv\" (UID: \"acfbc5ef-21ea-41bb-ba10-c5b5e79dd593\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183383 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183406 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9f128b3f-8527-4b4b-86d5-b456fe89c804-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dqctb\" (UID: \"9f128b3f-8527-4b4b-86d5-b456fe89c804\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183435 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc05dc28-13af-4d05-835a-e3ecc993b1ab-serving-cert\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183459 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183493 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-serving-cert\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183532 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2e62f6f-bd21-4255-85df-0c31e9edadaf-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183563 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2lg2\" (UniqueName: \"kubernetes.io/projected/5b9edf7b-9549-4eda-a45e-27b94c137b4a-kube-api-access-m2lg2\") pod \"olm-operator-6b444d44fb-9mrg6\" (UID: \"5b9edf7b-9549-4eda-a45e-27b94c137b4a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183578 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183590 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbh4k\" (UniqueName: \"kubernetes.io/projected/149bcfc3-9623-403e-8c4c-1019bd5f0c16-kube-api-access-qbh4k\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183599 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-audit\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183631 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d96mv\" (UniqueName: \"kubernetes.io/projected/18a3347a-5dd7-4047-8c43-9c073c9321e6-kube-api-access-d96mv\") pod \"marketplace-operator-79b997595-j4llw\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j4llw"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183703 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-oauth-config\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183759 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vbnz\" (UniqueName: \"kubernetes.io/projected/9f128b3f-8527-4b4b-86d5-b456fe89c804-kube-api-access-9vbnz\") pod \"openshift-config-operator-7777fb866f-dqctb\" (UID: \"9f128b3f-8527-4b4b-86d5-b456fe89c804\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183848 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-dir\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.183856 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3db6bfaa-89b3-49e5-9c33-2959670f96f1-etcd-ca\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.184014 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-etcd-client\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.184042 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxmdb\" (UniqueName: \"kubernetes.io/projected/43f553ef-0150-4383-8c39-5db2cbcab63d-kube-api-access-kxmdb\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.184067 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.184847 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58vgz\" (UniqueName: \"kubernetes.io/projected/3707018b-031a-4902-8e5c-ba5bc46cc4c4-kube-api-access-58vgz\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.184949 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a0286241-5427-4d37-9c39-717227ba63d8-machine-approver-tls\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.184961 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9f128b3f-8527-4b4b-86d5-b456fe89c804-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dqctb\" (UID: \"9f128b3f-8527-4b4b-86d5-b456fe89c804\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.184959 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/456d086a-23a5-42fa-b637-a090f649ffe5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mxgkh\" (UID: \"456d086a-23a5-42fa-b637-a090f649ffe5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.185109 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.185207 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df56857-33ce-442e-a600-188003fc196e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q6hcz\" (UID: \"6df56857-33ce-442e-a600-188003fc196e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.185276 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-images\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.185389 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f128b3f-8527-4b4b-86d5-b456fe89c804-serving-cert\") pod \"openshift-config-operator-7777fb866f-dqctb\" (UID: \"9f128b3f-8527-4b4b-86d5-b456fe89c804\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.185466 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0286241-5427-4d37-9c39-717227ba63d8-auth-proxy-config\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.185495 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-serving-cert\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.185528 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd99710-b175-4115-8944-1fac544145c5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.185569 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw69f\" (UniqueName: \"kubernetes.io/projected/25f80446-5f8d-476a-91e7-c42b9572854b-kube-api-access-pw69f\") pod \"openshift-controller-manager-operator-756b6f6bc6-ds5z6\" (UID: \"25f80446-5f8d-476a-91e7-c42b9572854b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.185703 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186068 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0286241-5427-4d37-9c39-717227ba63d8-auth-proxy-config\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186162 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-audit-dir\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186263 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-default-certificate\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186294 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-audit-dir\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186389 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186429 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b5551de-6f87-439b-875c-b66e902f2f25-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186498 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc05dc28-13af-4d05-835a-e3ecc993b1ab-trusted-ca\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186503 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57387b14-66f5-4d48-9591-784eb4216a13-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z77sh\" (UID: \"57387b14-66f5-4d48-9591-784eb4216a13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186675 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57387b14-66f5-4d48-9591-784eb4216a13-config\") pod \"kube-apiserver-operator-766d6c64bb-z77sh\" (UID: \"57387b14-66f5-4d48-9591-784eb4216a13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186831 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78ff2eb3-3f40-4529-b52e-62316f24fd15-metrics-tls\") pod \"dns-operator-744455d44c-l9ndv\" (UID: \"78ff2eb3-3f40-4529-b52e-62316f24fd15\") " pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186877 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d682539-5f81-4b10-a11a-50d3a9ef0c1c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gj9xr\" (UID: \"9d682539-5f81-4b10-a11a-50d3a9ef0c1c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.186928 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d657cd-dbc9-45d6-8ac4-9c22d9709980-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j4s9g\" (UID: \"f8d657cd-dbc9-45d6-8ac4-9c22d9709980\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.187061 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8-serving-cert\") pod \"service-ca-operator-777779d784-tg2g5\" (UID: \"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5"
Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.187146 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bclz6\" (UID:
\"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.188332 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtns2\" (UniqueName: \"kubernetes.io/projected/39a56faf-6fea-45d0-9531-fb86f571fd8b-kube-api-access-gtns2\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.187737 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-etcd-client\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.188102 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc05dc28-13af-4d05-835a-e3ecc993b1ab-serving-cert\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.188577 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v447f\" (UniqueName: \"kubernetes.io/projected/ed392029-9b45-4464-9422-10e4ec72db07-kube-api-access-v447f\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.188685 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6df56857-33ce-442e-a600-188003fc196e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q6hcz\" (UID: \"6df56857-33ce-442e-a600-188003fc196e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.188724 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-image-import-ca\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.188755 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/456d086a-23a5-42fa-b637-a090f649ffe5-proxy-tls\") pod \"machine-config-controller-84d6567774-mxgkh\" (UID: \"456d086a-23a5-42fa-b637-a090f649ffe5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.188799 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.188842 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8dtd\" (UniqueName: \"kubernetes.io/projected/1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8-kube-api-access-c8dtd\") pod \"service-ca-operator-777779d784-tg2g5\" (UID: \"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.190966 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m47g9"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191031 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191105 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6df56857-33ce-442e-a600-188003fc196e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q6hcz\" (UID: \"6df56857-33ce-442e-a600-188003fc196e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191146 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191193 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3db6bfaa-89b3-49e5-9c33-2959670f96f1-serving-cert\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191104 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x4qx6"] Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 
13:59:34.191229 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191279 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-policies\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191346 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-encryption-config\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191381 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfgz7\" (UniqueName: \"kubernetes.io/projected/cc05dc28-13af-4d05-835a-e3ecc993b1ab-kube-api-access-jfgz7\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191435 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppsds\" (UniqueName: \"kubernetes.io/projected/be1e2228-c1df-48a2-83a3-ef74747c69c9-kube-api-access-ppsds\") pod \"migrator-59844c95c7-rs5bx\" (UID: 
\"be1e2228-c1df-48a2-83a3-ef74747c69c9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191477 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkb2f\" (UniqueName: \"kubernetes.io/projected/5345e1d1-a74f-4d8f-8b86-2bb389a525a2-kube-api-access-nkb2f\") pod \"downloads-7954f5f757-wwvbf\" (UID: \"5345e1d1-a74f-4d8f-8b86-2bb389a525a2\") " pod="openshift-console/downloads-7954f5f757-wwvbf" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191541 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qd62\" (UniqueName: \"kubernetes.io/projected/6b5551de-6f87-439b-875c-b66e902f2f25-kube-api-access-8qd62\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191597 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kjls\" (UniqueName: \"kubernetes.io/projected/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-kube-api-access-8kjls\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191612 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f128b3f-8527-4b4b-86d5-b456fe89c804-serving-cert\") pod \"openshift-config-operator-7777fb866f-dqctb\" (UID: \"9f128b3f-8527-4b4b-86d5-b456fe89c804\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191626 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cc05dc28-13af-4d05-835a-e3ecc993b1ab-config\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191681 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-serving-cert\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191708 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-service-ca\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191761 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3db6bfaa-89b3-49e5-9c33-2959670f96f1-etcd-client\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.191958 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.192079 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78ff2eb3-3f40-4529-b52e-62316f24fd15-metrics-tls\") pod \"dns-operator-744455d44c-l9ndv\" (UID: \"78ff2eb3-3f40-4529-b52e-62316f24fd15\") " pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv" Feb 18 13:59:34 
crc kubenswrapper[4817]: I0218 13:59:34.192612 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25f80446-5f8d-476a-91e7-c42b9572854b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ds5z6\" (UID: \"25f80446-5f8d-476a-91e7-c42b9572854b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.192721 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d682539-5f81-4b10-a11a-50d3a9ef0c1c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gj9xr\" (UID: \"9d682539-5f81-4b10-a11a-50d3a9ef0c1c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.194170 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-image-import-ca\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.194730 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-service-ca\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.194899 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b5551de-6f87-439b-875c-b66e902f2f25-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: 
\"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.195251 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc05dc28-13af-4d05-835a-e3ecc993b1ab-config\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.195494 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d657cd-dbc9-45d6-8ac4-9c22d9709980-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j4s9g\" (UID: \"f8d657cd-dbc9-45d6-8ac4-9c22d9709980\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.196537 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3db6bfaa-89b3-49e5-9c33-2959670f96f1-etcd-client\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.196752 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-encryption-config\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.197504 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3db6bfaa-89b3-49e5-9c33-2959670f96f1-serving-cert\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.198117 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-serving-cert\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.210642 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.230796 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.251915 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.265545 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j4llw\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.278580 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.285151 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j4llw\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.291301 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.292407 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-default-certificate\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.292510 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57387b14-66f5-4d48-9591-784eb4216a13-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z77sh\" (UID: \"57387b14-66f5-4d48-9591-784eb4216a13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.292579 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57387b14-66f5-4d48-9591-784eb4216a13-config\") pod \"kube-apiserver-operator-766d6c64bb-z77sh\" (UID: \"57387b14-66f5-4d48-9591-784eb4216a13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.292649 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8-serving-cert\") pod \"service-ca-operator-777779d784-tg2g5\" (UID: 
\"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.292745 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v447f\" (UniqueName: \"kubernetes.io/projected/ed392029-9b45-4464-9422-10e4ec72db07-kube-api-access-v447f\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.292869 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df56857-33ce-442e-a600-188003fc196e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q6hcz\" (UID: \"6df56857-33ce-442e-a600-188003fc196e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.292967 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/456d086a-23a5-42fa-b637-a090f649ffe5-proxy-tls\") pod \"machine-config-controller-84d6567774-mxgkh\" (UID: \"456d086a-23a5-42fa-b637-a090f649ffe5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.293082 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8dtd\" (UniqueName: \"kubernetes.io/projected/1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8-kube-api-access-c8dtd\") pod \"service-ca-operator-777779d784-tg2g5\" (UID: \"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.293159 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6df56857-33ce-442e-a600-188003fc196e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q6hcz\" (UID: \"6df56857-33ce-442e-a600-188003fc196e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.293257 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.293345 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppsds\" (UniqueName: \"kubernetes.io/projected/be1e2228-c1df-48a2-83a3-ef74747c69c9-kube-api-access-ppsds\") pod \"migrator-59844c95c7-rs5bx\" (UID: \"be1e2228-c1df-48a2-83a3-ef74747c69c9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.293481 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mdfh\" (UniqueName: \"kubernetes.io/projected/98bc0caa-ee8a-406e-952d-b400d8c72116-kube-api-access-4mdfh\") pod \"openshift-apiserver-operator-796bbdcf4f-twxwz\" (UID: \"98bc0caa-ee8a-406e-952d-b400d8c72116\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.293586 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0f3e1cca-80c4-4d17-bb9f-9556fed78aac-srv-cert\") pod \"catalog-operator-68c6474976-vqv79\" (UID: 
\"0f3e1cca-80c4-4d17-bb9f-9556fed78aac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.293661 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlr8v\" (UniqueName: \"kubernetes.io/projected/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-kube-api-access-xlr8v\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.293738 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98bc0caa-ee8a-406e-952d-b400d8c72116-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-twxwz\" (UID: \"98bc0caa-ee8a-406e-952d-b400d8c72116\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.293806 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-metrics-certs\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.293890 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f85543c-dccf-4a3e-be40-7305a2e49d1d-secret-volume\") pod \"collect-profiles-29523705-gm644\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.294020 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6smt\" (UniqueName: 
\"kubernetes.io/projected/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-kube-api-access-p6smt\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.294117 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed392029-9b45-4464-9422-10e4ec72db07-images\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.294203 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b9edf7b-9549-4eda-a45e-27b94c137b4a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9mrg6\" (UID: \"5b9edf7b-9549-4eda-a45e-27b94c137b4a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.294296 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgkpz\" (UniqueName: \"kubernetes.io/projected/9f66536d-c481-41b3-b5e5-8259651a95d9-kube-api-access-cgkpz\") pod \"control-plane-machine-set-operator-78cbb6b69f-mj6tz\" (UID: \"9f66536d-c481-41b3-b5e5-8259651a95d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.294422 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqftb\" (UniqueName: \"kubernetes.io/projected/456d086a-23a5-42fa-b637-a090f649ffe5-kube-api-access-wqftb\") pod \"machine-config-controller-84d6567774-mxgkh\" (UID: \"456d086a-23a5-42fa-b637-a090f649ffe5\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.294525 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57387b14-66f5-4d48-9591-784eb4216a13-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z77sh\" (UID: \"57387b14-66f5-4d48-9591-784eb4216a13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.294707 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-stats-auth\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.294812 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.294910 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.295047 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f85543c-dccf-4a3e-be40-7305a2e49d1d-config-volume\") pod \"collect-profiles-29523705-gm644\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.295139 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.295238 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-config\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.295369 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-service-ca-bundle\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.295468 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l7rd\" (UniqueName: \"kubernetes.io/projected/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-kube-api-access-7l7rd\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.295562 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98bc0caa-ee8a-406e-952d-b400d8c72116-config\") pod \"openshift-apiserver-operator-796bbdcf4f-twxwz\" (UID: \"98bc0caa-ee8a-406e-952d-b400d8c72116\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.295730 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f66536d-c481-41b3-b5e5-8259651a95d9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mj6tz\" (UID: \"9f66536d-c481-41b3-b5e5-8259651a95d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.295909 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0f3e1cca-80c4-4d17-bb9f-9556fed78aac-profile-collector-cert\") pod \"catalog-operator-68c6474976-vqv79\" (UID: \"0f3e1cca-80c4-4d17-bb9f-9556fed78aac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.296116 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acfbc5ef-21ea-41bb-ba10-c5b5e79dd593-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mx7fv\" (UID: \"acfbc5ef-21ea-41bb-ba10-c5b5e79dd593\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.296262 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xx565\" (UniqueName: \"kubernetes.io/projected/3f85543c-dccf-4a3e-be40-7305a2e49d1d-kube-api-access-xx565\") pod \"collect-profiles-29523705-gm644\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.296374 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8-config\") pod \"service-ca-operator-777779d784-tg2g5\" (UID: \"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.296471 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szfbg\" (UniqueName: \"kubernetes.io/projected/0f3e1cca-80c4-4d17-bb9f-9556fed78aac-kube-api-access-szfbg\") pod \"catalog-operator-68c6474976-vqv79\" (UID: \"0f3e1cca-80c4-4d17-bb9f-9556fed78aac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.296598 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed392029-9b45-4464-9422-10e4ec72db07-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.296709 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-config\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.296867 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed392029-9b45-4464-9422-10e4ec72db07-proxy-tls\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.297022 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-service-ca-bundle\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.297197 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b9edf7b-9549-4eda-a45e-27b94c137b4a-srv-cert\") pod \"olm-operator-6b444d44fb-9mrg6\" (UID: \"5b9edf7b-9549-4eda-a45e-27b94c137b4a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.297291 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skw9m\" (UniqueName: \"kubernetes.io/projected/acfbc5ef-21ea-41bb-ba10-c5b5e79dd593-kube-api-access-skw9m\") pod \"multus-admission-controller-857f4d67dd-mx7fv\" (UID: \"acfbc5ef-21ea-41bb-ba10-c5b5e79dd593\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.297388 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.297474 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-serving-cert\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.297559 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lg2\" (UniqueName: \"kubernetes.io/projected/5b9edf7b-9549-4eda-a45e-27b94c137b4a-kube-api-access-m2lg2\") pod \"olm-operator-6b444d44fb-9mrg6\" (UID: \"5b9edf7b-9549-4eda-a45e-27b94c137b4a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.297682 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/456d086a-23a5-42fa-b637-a090f649ffe5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mxgkh\" (UID: \"456d086a-23a5-42fa-b637-a090f649ffe5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.297767 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df56857-33ce-442e-a600-188003fc196e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q6hcz\" (UID: \"6df56857-33ce-442e-a600-188003fc196e\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.297997 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ed392029-9b45-4464-9422-10e4ec72db07-auth-proxy-config\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.298691 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/456d086a-23a5-42fa-b637-a090f649ffe5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mxgkh\" (UID: \"456d086a-23a5-42fa-b637-a090f649ffe5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.324374 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.330619 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.335354 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.350547 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 18 13:59:34 
crc kubenswrapper[4817]: I0218 13:59:34.357940 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-service-ca-bundle\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.371476 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.391815 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.406519 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-serving-cert\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.411234 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.431716 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.439839 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-config\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.451245 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.458934 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f85543c-dccf-4a3e-be40-7305a2e49d1d-secret-volume\") pod \"collect-profiles-29523705-gm644\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.459036 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5b9edf7b-9549-4eda-a45e-27b94c137b4a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9mrg6\" (UID: \"5b9edf7b-9549-4eda-a45e-27b94c137b4a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.461114 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0f3e1cca-80c4-4d17-bb9f-9556fed78aac-profile-collector-cert\") pod \"catalog-operator-68c6474976-vqv79\" (UID: \"0f3e1cca-80c4-4d17-bb9f-9556fed78aac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.471296 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.480867 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b9edf7b-9549-4eda-a45e-27b94c137b4a-srv-cert\") pod \"olm-operator-6b444d44fb-9mrg6\" 
(UID: \"5b9edf7b-9549-4eda-a45e-27b94c137b4a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.491039 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.511631 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.520466 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/acfbc5ef-21ea-41bb-ba10-c5b5e79dd593-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mx7fv\" (UID: \"acfbc5ef-21ea-41bb-ba10-c5b5e79dd593\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.551594 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.571224 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.591140 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.611022 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.630885 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.651129 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 18 13:59:34 crc 
kubenswrapper[4817]: I0218 13:59:34.671937 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.691194 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.696258 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8-serving-cert\") pod \"service-ca-operator-777779d784-tg2g5\" (UID: \"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.711561 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.718282 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8-config\") pod \"service-ca-operator-777779d784-tg2g5\" (UID: \"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.731477 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.751495 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.771373 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 
13:59:34.780508 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f66536d-c481-41b3-b5e5-8259651a95d9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mj6tz\" (UID: \"9f66536d-c481-41b3-b5e5-8259651a95d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.792060 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.812505 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.832571 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.838616 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57387b14-66f5-4d48-9591-784eb4216a13-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z77sh\" (UID: \"57387b14-66f5-4d48-9591-784eb4216a13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.854928 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.870931 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.874611 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/57387b14-66f5-4d48-9591-784eb4216a13-config\") pod \"kube-apiserver-operator-766d6c64bb-z77sh\" (UID: \"57387b14-66f5-4d48-9591-784eb4216a13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.893500 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.912550 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.931845 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.938636 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6df56857-33ce-442e-a600-188003fc196e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q6hcz\" (UID: \"6df56857-33ce-442e-a600-188003fc196e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.951861 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.955232 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df56857-33ce-442e-a600-188003fc196e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q6hcz\" (UID: \"6df56857-33ce-442e-a600-188003fc196e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" Feb 18 13:59:34 
crc kubenswrapper[4817]: I0218 13:59:34.972893 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.975867 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed392029-9b45-4464-9422-10e4ec72db07-images\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" Feb 18 13:59:34 crc kubenswrapper[4817]: I0218 13:59:34.991597 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.011944 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.023328 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed392029-9b45-4464-9422-10e4ec72db07-proxy-tls\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.030967 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.035823 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f85543c-dccf-4a3e-be40-7305a2e49d1d-config-volume\") pod \"collect-profiles-29523705-gm644\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.049834 4817 request.go:700] Waited for 1.007531687s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcollect-profiles-dockercfg-kzf4t&limit=500&resourceVersion=0 Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.051804 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.072072 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.091021 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.111542 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.131197 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.137747 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0f3e1cca-80c4-4d17-bb9f-9556fed78aac-srv-cert\") pod \"catalog-operator-68c6474976-vqv79\" (UID: \"0f3e1cca-80c4-4d17-bb9f-9556fed78aac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.151925 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 18 
13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.171238 4817 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.171336 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.171248 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-idp-0-file-data: failed to sync secret cache: timed out waiting for the condition Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.171386 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert podName:3707018b-031a-4902-8e5c-ba5bc46cc4c4 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.671339929 +0000 UTC m=+38.246875952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert") pod "route-controller-manager-6576b87f9c-hrdpr" (UID: "3707018b-031a-4902-8e5c-ba5bc46cc4c4") : failed to sync secret cache: timed out waiting for the condition Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.171391 4817 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.171639 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.671612466 +0000 UTC m=+38.247148489 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-user-idp-0-file-data" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.171673 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-encryption-config podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.671657268 +0000 UTC m=+38.247193291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-encryption-config") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.171725 4817 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.171781 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config podName:43f553ef-0150-4383-8c39-5db2cbcab63d nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.6717572 +0000 UTC m=+38.247293183 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config") pod "controller-manager-879f6c89f-l7vpp" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.172960 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.173933 4817 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.173993 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-trusted-ca-bundle podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.673967327 +0000 UTC m=+38.249503310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-trusted-ca-bundle") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.174053 4817 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.174164 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-config podName:3bd99710-b175-4115-8944-1fac544145c5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.674128271 +0000 UTC m=+38.249664454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-config") pod "machine-api-operator-5694c8668f-x8x2r" (UID: "3bd99710-b175-4115-8944-1fac544145c5") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.177992 4817 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.178037 4817 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.178038 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.178084 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-serving-ca podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.678071753 +0000 UTC m=+38.253607736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-serving-ca") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.178103 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert podName:43f553ef-0150-4383-8c39-5db2cbcab63d nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.678095184 +0000 UTC m=+38.253631167 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert") pod "controller-manager-879f6c89f-l7vpp" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.177995 4817 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.178119 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.678111774 +0000 UTC m=+38.253647757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-serving-cert" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.178146 4817 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.178167 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.678148245 +0000 UTC m=+38.253684428 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.178282 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca podName:3707018b-031a-4902-8e5c-ba5bc46cc4c4 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.678255928 +0000 UTC m=+38.253791951 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca") pod "route-controller-manager-6576b87f9c-hrdpr" (UID: "3707018b-031a-4902-8e5c-ba5bc46cc4c4") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.181686 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-provider-selection: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.181775 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.681755169 +0000 UTC m=+38.257291192 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-provider-selection" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.181941 4817 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.182036 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-serving-cert podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.682019596 +0000 UTC m=+38.257555619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-serving-cert") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.184399 4817 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.184451 4817 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.184487 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-client podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.684467869 +0000 UTC m=+38.260004082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-client") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.184537 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config podName:3707018b-031a-4902-8e5c-ba5bc46cc4c4 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.6845083 +0000 UTC m=+38.260044323 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config") pod "route-controller-manager-6576b87f9c-hrdpr" (UID: "3707018b-031a-4902-8e5c-ba5bc46cc4c4") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.184605 4817 configmap.go:193] Couldn't get configMap openshift-authentication/audit: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.184678 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-ocp-branding-template: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.184730 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.684707245 +0000 UTC m=+38.260243268 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.184765 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.684749496 +0000 UTC m=+38.260285519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-ocp-branding-template" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.185276 4817 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.185413 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.685391613 +0000 UTC m=+38.260927636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.185534 4817 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.185599 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-images podName:3bd99710-b175-4115-8944-1fac544145c5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.685581848 +0000 UTC m=+38.261117861 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-images") pod "machine-api-operator-5694c8668f-x8x2r" (UID: "3bd99710-b175-4115-8944-1fac544145c5") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.185434 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.185698 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.68568039 +0000 UTC m=+38.261216403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193075 4817 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193131 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd99710-b175-4115-8944-1fac544145c5-machine-api-operator-tls podName:3bd99710-b175-4115-8944-1fac544145c5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.693120893 +0000 UTC m=+38.268656876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/3bd99710-b175-4115-8944-1fac544145c5-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-x8x2r" (UID: "3bd99710-b175-4115-8944-1fac544145c5") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193176 4817 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193183 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-router-certs: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193356 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193411 4817 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193444 4817 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193470 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-login: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193199 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.693192925 +0000 UTC m=+38.268728908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193844 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.693810451 +0000 UTC m=+38.269346474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193878 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-policies podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.693862802 +0000 UTC m=+38.269398825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-policies") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193914 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca podName:43f553ef-0150-4383-8c39-5db2cbcab63d nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.693892433 +0000 UTC m=+38.269428456 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca") pod "controller-manager-879f6c89f-l7vpp" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.193945 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.693928974 +0000 UTC m=+38.269464997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-login" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.195112 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.695079234 +0000 UTC m=+38.270615247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-router-certs" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.196533 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.208321 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98bc0caa-ee8a-406e-952d-b400d8c72116-config\") pod \"openshift-apiserver-operator-796bbdcf4f-twxwz\" (UID: \"98bc0caa-ee8a-406e-952d-b400d8c72116\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.211609 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.234616 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.252593 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.257391 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98bc0caa-ee8a-406e-952d-b400d8c72116-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-twxwz\" (UID: \"98bc0caa-ee8a-406e-952d-b400d8c72116\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.272064 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.291464 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.293187 4817 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.293323 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/456d086a-23a5-42fa-b637-a090f649ffe5-proxy-tls podName:456d086a-23a5-42fa-b637-a090f649ffe5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.793281134 +0000 UTC m=+38.368817157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/456d086a-23a5-42fa-b637-a090f649ffe5-proxy-tls") pod "machine-config-controller-84d6567774-mxgkh" (UID: "456d086a-23a5-42fa-b637-a090f649ffe5") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.294182 4817 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.294253 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-default-certificate podName:0a42d5a9-1383-4a55-9c81-bd40eb5ba86f nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.794237959 +0000 UTC m=+38.369773952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-default-certificate") pod "router-default-5444994796-6w9rz" (UID: "0a42d5a9-1383-4a55-9c81-bd40eb5ba86f") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.294504 4817 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.294769 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-metrics-certs podName:0a42d5a9-1383-4a55-9c81-bd40eb5ba86f nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.794717602 +0000 UTC m=+38.370253595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-metrics-certs") pod "router-default-5444994796-6w9rz" (UID: "0a42d5a9-1383-4a55-9c81-bd40eb5ba86f") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.294933 4817 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.294954 4817 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.295312 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-serving-cert podName:4ef73bb6-9567-4fe4-81ae-ae50b8c70455 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.795295936 +0000 UTC m=+38.370831929 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-2b4hg" (UID: "4ef73bb6-9567-4fe4-81ae-ae50b8c70455") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.295474 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-stats-auth podName:0a42d5a9-1383-4a55-9c81-bd40eb5ba86f nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.795460331 +0000 UTC m=+38.370996324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-stats-auth") pod "router-default-5444994796-6w9rz" (UID: "0a42d5a9-1383-4a55-9c81-bd40eb5ba86f") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.295487 4817 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.295826 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-config podName:ee154dbc-917d-46f5-bd9e-1a3c11a19a41 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.79581521 +0000 UTC m=+38.371351203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-config") pod "kube-controller-manager-operator-78b949d7b-vc74r" (UID: "ee154dbc-917d-46f5-bd9e-1a3c11a19a41") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.295481 4817 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.295550 4817 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.296297 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-config podName:4ef73bb6-9567-4fe4-81ae-ae50b8c70455 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.796240601 +0000 UTC m=+38.371776614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-config") pod "kube-storage-version-migrator-operator-b67b599dd-2b4hg" (UID: "4ef73bb6-9567-4fe4-81ae-ae50b8c70455") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.296403 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-service-ca-bundle podName:0a42d5a9-1383-4a55-9c81-bd40eb5ba86f nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.796386505 +0000 UTC m=+38.371922528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-service-ca-bundle") pod "router-default-5444994796-6w9rz" (UID: "0a42d5a9-1383-4a55-9c81-bd40eb5ba86f") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.299327 4817 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: E0218 13:59:35.299421 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-serving-cert podName:ee154dbc-917d-46f5-bd9e-1a3c11a19a41 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:35.799397293 +0000 UTC m=+38.374933316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-serving-cert") pod "kube-controller-manager-operator-78b949d7b-vc74r" (UID: "ee154dbc-917d-46f5-bd9e-1a3c11a19a41") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.312917 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.330887 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.350975 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.371765 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.391729 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.412194 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.431753 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.451801 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.473214 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.493484 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.512854 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.531789 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.552859 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.570933 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.592024 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.622331 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.633147 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.652321 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.692490 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.712146 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.732200 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.733228 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.734805 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.734878 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-images\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.734933 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd99710-b175-4115-8944-1fac544145c5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.735037 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.735093 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.735136
4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.735227 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.735316 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.735404 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-policies\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.735528 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: 
\"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.735969 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.736134 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.736244 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-encryption-config\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.736371 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.736468 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-config\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.736513 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.736623 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.736734 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.736796 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.736844 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.736961 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-serving-cert\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.737074 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.737131 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-client\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.737175 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.737270 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.737350 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.750693 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.772566 4817 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.791967 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.812707 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.832244 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.846361 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-metrics-certs\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.846749 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-stats-auth\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.846830 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.846922 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.847025 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-config\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.847172 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-service-ca-bundle\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.847565 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.848297 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-default-certificate\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.848447 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/456d086a-23a5-42fa-b637-a090f649ffe5-proxy-tls\") pod \"machine-config-controller-84d6567774-mxgkh\" (UID: \"456d086a-23a5-42fa-b637-a090f649ffe5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.849735 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-config\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.850501 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-service-ca-bundle\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.852510 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.854028 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-metrics-certs\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.855019 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.856152 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-stats-auth\") pod \"router-default-5444994796-6w9rz\" 
(UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.857494 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/456d086a-23a5-42fa-b637-a090f649ffe5-proxy-tls\") pod \"machine-config-controller-84d6567774-mxgkh\" (UID: \"456d086a-23a5-42fa-b637-a090f649ffe5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.858934 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.859944 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.862114 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-default-certificate\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.872110 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-sysctl-allowlist" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.892864 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.912193 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.931425 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 18 13:59:35 crc kubenswrapper[4817]: I0218 13:59:35.952871 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.011225 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlpp7\" (UniqueName: \"kubernetes.io/projected/c2e62f6f-bd21-4255-85df-0c31e9edadaf-kube-api-access-xlpp7\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.033695 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dthtv\" (UniqueName: \"kubernetes.io/projected/f8d657cd-dbc9-45d6-8ac4-9c22d9709980-kube-api-access-dthtv\") pod \"package-server-manager-789f6589d5-j4s9g\" (UID: \"f8d657cd-dbc9-45d6-8ac4-9c22d9709980\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.046113 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgcch\" (UniqueName: \"kubernetes.io/projected/3db6bfaa-89b3-49e5-9c33-2959670f96f1-kube-api-access-fgcch\") pod \"etcd-operator-b45778765-bwc6d\" (UID: \"3db6bfaa-89b3-49e5-9c33-2959670f96f1\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.069793 4817 request.go:700] Waited for 1.89310823s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&limit=500&resourceVersion=0 Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.071461 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9k4j\" (UniqueName: \"kubernetes.io/projected/a0286241-5427-4d37-9c39-717227ba63d8-kube-api-access-p9k4j\") pod \"machine-approver-56656f9798-w7l8m\" (UID: \"a0286241-5427-4d37-9c39-717227ba63d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.072598 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.091871 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.112020 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.134322 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.152205 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.170878 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 18 13:59:36 crc kubenswrapper[4817]: 
I0218 13:59:36.190111 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" Feb 18 13:59:36 crc kubenswrapper[4817]: W0218 13:59:36.210669 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0286241_5427_4d37_9c39_717227ba63d8.slice/crio-7e6b77fa3ed395e4a97c922bdd856d5861f7d5a1cbf93c0de030c2d7245fc97f WatchSource:0}: Error finding container 7e6b77fa3ed395e4a97c922bdd856d5861f7d5a1cbf93c0de030c2d7245fc97f: Status 404 returned error can't find the container with id 7e6b77fa3ed395e4a97c922bdd856d5861f7d5a1cbf93c0de030c2d7245fc97f Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.213684 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b5551de-6f87-439b-875c-b66e902f2f25-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.242259 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9ncz\" (UniqueName: \"kubernetes.io/projected/78ff2eb3-3f40-4529-b52e-62316f24fd15-kube-api-access-f9ncz\") pod \"dns-operator-744455d44c-l9ndv\" (UID: \"78ff2eb3-3f40-4529-b52e-62316f24fd15\") " pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.252063 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.263915 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.265948 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d96mv\" (UniqueName: \"kubernetes.io/projected/18a3347a-5dd7-4047-8c43-9c073c9321e6-kube-api-access-d96mv\") pod \"marketplace-operator-79b997595-j4llw\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.280963 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.288245 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2e62f6f-bd21-4255-85df-0c31e9edadaf-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7wn4g\" (UID: \"c2e62f6f-bd21-4255-85df-0c31e9edadaf\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.313122 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxqt7\" (UniqueName: \"kubernetes.io/projected/9d682539-5f81-4b10-a11a-50d3a9ef0c1c-kube-api-access-sxqt7\") pod \"cluster-samples-operator-665b6dd947-gj9xr\" (UID: \"9d682539-5f81-4b10-a11a-50d3a9ef0c1c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.332334 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbh4k\" (UniqueName: \"kubernetes.io/projected/149bcfc3-9623-403e-8c4c-1019bd5f0c16-kube-api-access-qbh4k\") pod \"console-f9d7485db-v2snv\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " pod="openshift-console/console-f9d7485db-v2snv" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 
13:59:36.343520 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.368270 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vbnz\" (UniqueName: \"kubernetes.io/projected/9f128b3f-8527-4b4b-86d5-b456fe89c804-kube-api-access-9vbnz\") pod \"openshift-config-operator-7777fb866f-dqctb\" (UID: \"9f128b3f-8527-4b4b-86d5-b456fe89c804\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.407436 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw69f\" (UniqueName: \"kubernetes.io/projected/25f80446-5f8d-476a-91e7-c42b9572854b-kube-api-access-pw69f\") pod \"openshift-controller-manager-operator-756b6f6bc6-ds5z6\" (UID: \"25f80446-5f8d-476a-91e7-c42b9572854b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.445897 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qd62\" (UniqueName: \"kubernetes.io/projected/6b5551de-6f87-439b-875c-b66e902f2f25-kube-api-access-8qd62\") pod \"cluster-image-registry-operator-dc59b4c8b-46tjc\" (UID: \"6b5551de-6f87-439b-875c-b66e902f2f25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.466479 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfgz7\" (UniqueName: \"kubernetes.io/projected/cc05dc28-13af-4d05-835a-e3ecc993b1ab-kube-api-access-jfgz7\") pod \"console-operator-58897d9998-rpmhp\" (UID: \"cc05dc28-13af-4d05-835a-e3ecc993b1ab\") " pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.503170 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" event={"ID":"a0286241-5427-4d37-9c39-717227ba63d8","Type":"ContainerStarted","Data":"7e6b77fa3ed395e4a97c922bdd856d5861f7d5a1cbf93c0de030c2d7245fc97f"}
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.506360 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kjls\" (UniqueName: \"kubernetes.io/projected/67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c-kube-api-access-8kjls\") pod \"apiserver-76f77b778f-4rsdh\" (UID: \"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c\") " pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.520968 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkb2f\" (UniqueName: \"kubernetes.io/projected/5345e1d1-a74f-4d8f-8b86-2bb389a525a2-kube-api-access-nkb2f\") pod \"downloads-7954f5f757-wwvbf\" (UID: \"5345e1d1-a74f-4d8f-8b86-2bb389a525a2\") " pod="openshift-console/downloads-7954f5f757-wwvbf"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.523323 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.529274 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.530684 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v447f\" (UniqueName: \"kubernetes.io/projected/ed392029-9b45-4464-9422-10e4ec72db07-kube-api-access-v447f\") pod \"machine-config-operator-74547568cd-b75zt\" (UID: \"ed392029-9b45-4464-9422-10e4ec72db07\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.535914 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.545575 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.549494 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8dtd\" (UniqueName: \"kubernetes.io/projected/1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8-kube-api-access-c8dtd\") pod \"service-ca-operator-777779d784-tg2g5\" (UID: \"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.566764 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppsds\" (UniqueName: \"kubernetes.io/projected/be1e2228-c1df-48a2-83a3-ef74747c69c9-kube-api-access-ppsds\") pod \"migrator-59844c95c7-rs5bx\" (UID: \"be1e2228-c1df-48a2-83a3-ef74747c69c9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.589721 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdfh\" (UniqueName: \"kubernetes.io/projected/98bc0caa-ee8a-406e-952d-b400d8c72116-kube-api-access-4mdfh\") pod \"openshift-apiserver-operator-796bbdcf4f-twxwz\" (UID: \"98bc0caa-ee8a-406e-952d-b400d8c72116\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.604814 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlr8v\" (UniqueName: \"kubernetes.io/projected/0a42d5a9-1383-4a55-9c81-bd40eb5ba86f-kube-api-access-xlr8v\") pod \"router-default-5444994796-6w9rz\" (UID: \"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f\") " pod="openshift-ingress/router-default-5444994796-6w9rz"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.614775 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.619858 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g"]
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.620936 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j4llw"]
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.626734 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.627787 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6smt\" (UniqueName: \"kubernetes.io/projected/4ef73bb6-9567-4fe4-81ae-ae50b8c70455-kube-api-access-p6smt\") pod \"kube-storage-version-migrator-operator-b67b599dd-2b4hg\" (UID: \"4ef73bb6-9567-4fe4-81ae-ae50b8c70455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.628231 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.633748 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.644234 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wwvbf"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.665011 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqftb\" (UniqueName: \"kubernetes.io/projected/456d086a-23a5-42fa-b637-a090f649ffe5-kube-api-access-wqftb\") pod \"machine-config-controller-84d6567774-mxgkh\" (UID: \"456d086a-23a5-42fa-b637-a090f649ffe5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.687203 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57387b14-66f5-4d48-9591-784eb4216a13-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z77sh\" (UID: \"57387b14-66f5-4d48-9591-784eb4216a13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.695793 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.717581 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.728147 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ee154dbc-917d-46f5-bd9e-1a3c11a19a41-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vc74r\" (UID: \"ee154dbc-917d-46f5-bd9e-1a3c11a19a41\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.733874 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt"
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.734461 4817 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.734825 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.734550085 +0000 UTC m=+40.310086058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.735871 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-router-certs: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.735912 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.73590115 +0000 UTC m=+40.311437133 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-router-certs" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.735966 4817 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.735975 4817 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736034 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-images podName:3bd99710-b175-4115-8944-1fac544145c5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.736026373 +0000 UTC m=+40.311562356 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-images") pod "machine-api-operator-5694c8668f-x8x2r" (UID: "3bd99710-b175-4115-8944-1fac544145c5") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736053 4817 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736069 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd99710-b175-4115-8944-1fac544145c5-machine-api-operator-tls podName:3bd99710-b175-4115-8944-1fac544145c5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.736044874 +0000 UTC m=+40.311580857 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/3bd99710-b175-4115-8944-1fac544145c5-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-x8x2r" (UID: "3bd99710-b175-4115-8944-1fac544145c5") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736084 4817 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736091 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert podName:3707018b-031a-4902-8e5c-ba5bc46cc4c4 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.736083085 +0000 UTC m=+40.311619068 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert") pod "route-controller-manager-6576b87f9c-hrdpr" (UID: "3707018b-031a-4902-8e5c-ba5bc46cc4c4") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736107 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca podName:43f553ef-0150-4383-8c39-5db2cbcab63d nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.736100675 +0000 UTC m=+40.311636758 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca") pod "controller-manager-879f6c89f-l7vpp" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736107 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736116 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-login: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736135 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.736129466 +0000 UTC m=+40.311665439 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736147 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.736141776 +0000 UTC m=+40.311677759 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-login" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736145 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736192 4817 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736211 4817 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736222 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-policies podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.736213428 +0000 UTC m=+40.311749511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-policies") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736240 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.736231588 +0000 UTC m=+40.311767571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736249 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-idp-0-file-data: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736274 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.736268079 +0000 UTC m=+40.311804062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-user-idp-0-file-data" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736297 4817 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736321 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config podName:43f553ef-0150-4383-8c39-5db2cbcab63d nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.73631333 +0000 UTC m=+40.311849313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config") pod "controller-manager-879f6c89f-l7vpp" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.736395 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.736349981 +0000 UTC m=+40.311885964 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737032 4817 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737065 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-serving-ca podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.73705546 +0000 UTC m=+40.312591443 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-serving-ca") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737092 4817 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737114 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-config podName:3bd99710-b175-4115-8944-1fac544145c5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737108031 +0000 UTC m=+40.312644014 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-config") pod "machine-api-operator-5694c8668f-x8x2r" (UID: "3bd99710-b175-4115-8944-1fac544145c5") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737129 4817 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737160 4817 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737180 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-provider-selection: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737131 4817 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737212 4817 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737220 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737245 4817 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737149 4817 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737277 4817 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737187 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-serving-cert podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737180033 +0000 UTC m=+40.312716016 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-serving-cert") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737315 4817 configmap.go:193] Couldn't get configMap openshift-authentication/audit: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737331 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737318786 +0000 UTC m=+40.312854839 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737355 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737346647 +0000 UTC m=+40.312882740 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-provider-selection" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737402 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-trusted-ca-bundle podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737393178 +0000 UTC m=+40.312929231 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-trusted-ca-bundle") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737423 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737413199 +0000 UTC m=+40.312949342 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-serving-cert" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737439 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-encryption-config podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737431739 +0000 UTC m=+40.312967822 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-encryption-config") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737448 4817 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737491 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert podName:43f553ef-0150-4383-8c39-5db2cbcab63d nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737480081 +0000 UTC m=+40.313016164 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert") pod "controller-manager-879f6c89f-l7vpp" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737571 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca podName:3707018b-031a-4902-8e5c-ba5bc46cc4c4 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737537512 +0000 UTC m=+40.313073495 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca") pod "route-controller-manager-6576b87f9c-hrdpr" (UID: "3707018b-031a-4902-8e5c-ba5bc46cc4c4") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737600 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-client podName:ca679b0d-4e7e-4526-af6e-e3b0cb400fd0 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737588253 +0000 UTC m=+40.313124326 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-client") pod "apiserver-7bbb656c7d-pq5tr" (UID: "ca679b0d-4e7e-4526-af6e-e3b0cb400fd0") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737611 4817 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-ocp-branding-template: failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737622 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737615144 +0000 UTC m=+40.313151237 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737659 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config podName:3707018b-031a-4902-8e5c-ba5bc46cc4c4 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737651795 +0000 UTC m=+40.313187778 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config") pod "route-controller-manager-6576b87f9c-hrdpr" (UID: "3707018b-031a-4902-8e5c-ba5bc46cc4c4") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.737679 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template podName:39a56faf-6fea-45d0-9531-fb86f571fd8b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.737669566 +0000 UTC m=+40.313205639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-ocp-branding-template" (UniqueName: "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template") pod "oauth-openshift-558db77b4-bclz6" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.741565 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l7rd\" (UniqueName: \"kubernetes.io/projected/e61f548d-eef7-4f2a-9854-c5bdb6b2b815-kube-api-access-7l7rd\") pod \"authentication-operator-69f744f599-975fj\" (UID: \"e61f548d-eef7-4f2a-9854-c5bdb6b2b815\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.762059 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bwc6d"]
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.767932 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.775383 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx565\" (UniqueName: \"kubernetes.io/projected/3f85543c-dccf-4a3e-be40-7305a2e49d1d-kube-api-access-xx565\") pod \"collect-profiles-29523705-gm644\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.775557 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.778361 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szfbg\" (UniqueName: \"kubernetes.io/projected/0f3e1cca-80c4-4d17-bb9f-9556fed78aac-kube-api-access-szfbg\") pod \"catalog-operator-68c6474976-vqv79\" (UID: \"0f3e1cca-80c4-4d17-bb9f-9556fed78aac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.780866 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l9ndv"]
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.783851 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.791277 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.796076 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lg2\" (UniqueName: \"kubernetes.io/projected/5b9edf7b-9549-4eda-a45e-27b94c137b4a-kube-api-access-m2lg2\") pod \"olm-operator-6b444d44fb-9mrg6\" (UID: \"5b9edf7b-9549-4eda-a45e-27b94c137b4a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.802242 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6w9rz"
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.816240 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g"]
Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.816527 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.818421 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skw9m\" (UniqueName: \"kubernetes.io/projected/acfbc5ef-21ea-41bb-ba10-c5b5e79dd593-kube-api-access-skw9m\") pod \"multus-admission-controller-857f4d67dd-mx7fv\" (UID: \"acfbc5ef-21ea-41bb-ba10-c5b5e79dd593\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.831535 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rpmhp"] Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.834199 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df56857-33ce-442e-a600-188003fc196e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q6hcz\" (UID: \"6df56857-33ce-442e-a600-188003fc196e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.850907 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.855195 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr"] Feb 18 13:59:36 crc kubenswrapper[4817]: W0218 13:59:36.867209 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2e62f6f_bd21_4255_85df_0c31e9edadaf.slice/crio-f652aa2f2dca99ab529b72e79ff83953efd451066d7822ca9dc5e56882371ed0 WatchSource:0}: Error finding container f652aa2f2dca99ab529b72e79ff83953efd451066d7822ca9dc5e56882371ed0: Status 404 returned 
error can't find the container with id f652aa2f2dca99ab529b72e79ff83953efd451066d7822ca9dc5e56882371ed0 Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.873692 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.890922 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.910785 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.933390 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.953741 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.960129 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.972637 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.973534 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.976653 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc"] Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.980465 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.981651 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.982741 4817 projected.go:288] Couldn't get configMap openshift-machine-api/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.982801 4817 projected.go:194] Error preparing data for projected volume kube-api-access-zs8n9 for pod openshift-machine-api/machine-api-operator-5694c8668f-x8x2r: failed to sync configmap cache: timed out waiting for the condition Feb 18 13:59:36 crc kubenswrapper[4817]: E0218 13:59:36.983053 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bd99710-b175-4115-8944-1fac544145c5-kube-api-access-zs8n9 podName:3bd99710-b175-4115-8944-1fac544145c5 nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.4828812 +0000 UTC m=+40.058417183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zs8n9" (UniqueName: "kubernetes.io/projected/3bd99710-b175-4115-8944-1fac544145c5-kube-api-access-zs8n9") pod "machine-api-operator-5694c8668f-x8x2r" (UID: "3bd99710-b175-4115-8944-1fac544145c5") : failed to sync configmap cache: timed out waiting for the condition Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.996800 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 18 13:59:36 crc kubenswrapper[4817]: I0218 13:59:36.997054 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29f1a30b-47d4-452e-9017-dcc9cf78795f-metrics-certs\") pod \"network-metrics-daemon-pj24h\" (UID: \"29f1a30b-47d4-452e-9017-dcc9cf78795f\") " pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.014097 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.029426 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.035990 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.038013 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-v2snv"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.040432 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.049810 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dqctb"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.053411 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.060313 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.071338 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.093024 4817 request.go:700] Waited for 2.145247584s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/secrets?fieldSelector=metadata.name%3Dv4-0-config-user-idp-0-file-data&limit=500&resourceVersion=0 Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.097735 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 18 13:59:37 crc kubenswrapper[4817]: W0218 13:59:37.099770 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f128b3f_8527_4b4b_86d5_b456fe89c804.slice/crio-e30aba4a4f5a914e00f764e1307a90f5b16bdd0ae7a701fc099a61a3c5acf150 WatchSource:0}: Error finding container e30aba4a4f5a914e00f764e1307a90f5b16bdd0ae7a701fc099a61a3c5acf150: Status 404 returned error can't find the container with id e30aba4a4f5a914e00f764e1307a90f5b16bdd0ae7a701fc099a61a3c5acf150 Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 
13:59:37.112250 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.131648 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.144385 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wwvbf"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.159649 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.180949 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.191110 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4rsdh"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.192993 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.206504 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.208326 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj24h" Feb 18 13:59:37 crc kubenswrapper[4817]: W0218 13:59:37.212829 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5345e1d1_a74f_4d8f_8b86_2bb389a525a2.slice/crio-e2de8bef69dfe3c0d7a8a40533054f6a2c6ccef38b763fee5fc5d766dcecfd57 WatchSource:0}: Error finding container e2de8bef69dfe3c0d7a8a40533054f6a2c6ccef38b763fee5fc5d766dcecfd57: Status 404 returned error can't find the container with id e2de8bef69dfe3c0d7a8a40533054f6a2c6ccef38b763fee5fc5d766dcecfd57 Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.213013 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.230545 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtns2\" (UniqueName: \"kubernetes.io/projected/39a56faf-6fea-45d0-9531-fb86f571fd8b-kube-api-access-gtns2\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.231158 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.246026 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.251406 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 18 13:59:37 crc kubenswrapper[4817]: W0218 13:59:37.258669 4817 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67bbfc49_2d2d_4e26_8f38_cc7d2fd6765c.slice/crio-00b3749a02417ed70d3c0dfcfdc1adf7571b65fcd4e2da7631c7b6465f442797 WatchSource:0}: Error finding container 00b3749a02417ed70d3c0dfcfdc1adf7571b65fcd4e2da7631c7b6465f442797: Status 404 returned error can't find the container with id 00b3749a02417ed70d3c0dfcfdc1adf7571b65fcd4e2da7631c7b6465f442797 Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.270872 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 18 13:59:37 crc kubenswrapper[4817]: W0218 13:59:37.282502 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee154dbc_917d_46f5_bd9e_1a3c11a19a41.slice/crio-73ce85e5c04770ce161e73d3f12c1e2844ef935a90ee5fefa4b1e3d8bba08d5b WatchSource:0}: Error finding container 73ce85e5c04770ce161e73d3f12c1e2844ef935a90ee5fefa4b1e3d8bba08d5b: Status 404 returned error can't find the container with id 73ce85e5c04770ce161e73d3f12c1e2844ef935a90ee5fefa4b1e3d8bba08d5b Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.296313 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.310741 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.312965 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.321053 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.336295 4817 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 13:59:37 crc kubenswrapper[4817]: W0218 13:59:37.339214 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57387b14_66f5_4d48_9591_784eb4216a13.slice/crio-9e2e61a7f1908b9c18ae3493f5b403bbdb7230b41c2442c94ca57abe39033e85 WatchSource:0}: Error finding container 9e2e61a7f1908b9c18ae3493f5b403bbdb7230b41c2442c94ca57abe39033e85: Status 404 returned error can't find the container with id 9e2e61a7f1908b9c18ae3493f5b403bbdb7230b41c2442c94ca57abe39033e85 Feb 18 13:59:37 crc kubenswrapper[4817]: E0218 13:59:37.344807 4817 projected.go:288] Couldn't get configMap openshift-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 18 13:59:37 crc kubenswrapper[4817]: E0218 13:59:37.344865 4817 projected.go:194] Error preparing data for projected volume kube-api-access-kxmdb for pod openshift-controller-manager/controller-manager-879f6c89f-l7vpp: failed to sync configmap cache: timed out waiting for the condition Feb 18 13:59:37 crc kubenswrapper[4817]: E0218 13:59:37.344943 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/43f553ef-0150-4383-8c39-5db2cbcab63d-kube-api-access-kxmdb podName:43f553ef-0150-4383-8c39-5db2cbcab63d nodeName:}" failed. No retries permitted until 2026-02-18 13:59:37.844919007 +0000 UTC m=+40.420454990 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kxmdb" (UniqueName: "kubernetes.io/projected/43f553ef-0150-4383-8c39-5db2cbcab63d-kube-api-access-kxmdb") pod "controller-manager-879f6c89f-l7vpp" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d") : failed to sync configmap cache: timed out waiting for the condition Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.358114 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.358426 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.377625 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.393327 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.401471 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.411050 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.415997 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.418489 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgkpz\" (UniqueName: \"kubernetes.io/projected/9f66536d-c481-41b3-b5e5-8259651a95d9-kube-api-access-cgkpz\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-mj6tz\" (UID: \"9f66536d-c481-41b3-b5e5-8259651a95d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.432217 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 18 13:59:37 crc kubenswrapper[4817]: W0218 13:59:37.438821 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25f80446_5f8d_476a_91e7_c42b9572854b.slice/crio-d8bca4dfc5c93f238ae10372ace4f798c254a28744092f24b3e3c4f719b13a5b WatchSource:0}: Error finding container d8bca4dfc5c93f238ae10372ace4f798c254a28744092f24b3e3c4f719b13a5b: Status 404 returned error can't find the container with id d8bca4dfc5c93f238ae10372ace4f798c254a28744092f24b3e3c4f719b13a5b Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.439740 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knjlz\" (UniqueName: \"kubernetes.io/projected/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-kube-api-access-knjlz\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.452934 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.461895 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.471897 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.493364 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.528201 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs8n9\" (UniqueName: \"kubernetes.io/projected/3bd99710-b175-4115-8944-1fac544145c5-kube-api-access-zs8n9\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.555199 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.574067 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.574515 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs8n9\" (UniqueName: \"kubernetes.io/projected/3bd99710-b175-4115-8944-1fac544145c5-kube-api-access-zs8n9\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.575194 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.575745 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.576529 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.598942 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.607663 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.612069 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.612470 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v2snv" event={"ID":"149bcfc3-9623-403e-8c4c-1019bd5f0c16","Type":"ContainerStarted","Data":"6ac37bfaa3dea22921cdf36cd458fa7757e771b2e5b3b88783b1d68962ee8573"} Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.621601 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rpmhp" event={"ID":"cc05dc28-13af-4d05-835a-e3ecc993b1ab","Type":"ContainerStarted","Data":"1c6dcdaf571f11a240b101ca06c859ebb820fe3f988b1fb7a95f688151bf3b8a"} Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.642395 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.650374 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" event={"ID":"4ef73bb6-9567-4fe4-81ae-ae50b8c70455","Type":"ContainerStarted","Data":"0cbf0139b0285c72af103d4a0b8cecf7767c11963cb40fd71de12595d80710d4"} Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.660701 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx" 
event={"ID":"be1e2228-c1df-48a2-83a3-ef74747c69c9","Type":"ContainerStarted","Data":"40b4f197580116cb69bdda031bf00811d330e5bdde32a82ab6d4ff594a33eb2f"} Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.665414 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv" event={"ID":"78ff2eb3-3f40-4529-b52e-62316f24fd15","Type":"ContainerStarted","Data":"cfc0af401c9858e70e507d3a97fd250a7d81c6e61624479bf70f28841cca25a1"} Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.665494 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58vgz\" (UniqueName: \"kubernetes.io/projected/3707018b-031a-4902-8e5c-ba5bc46cc4c4-kube-api-access-58vgz\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.679935 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6w9rz" event={"ID":"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f","Type":"ContainerStarted","Data":"eb50fcdd5b521c5d82fdbec43ca51e2c60b11d5daba7f6e7ac41f62626e40ea5"} Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.693394 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" event={"ID":"c2e62f6f-bd21-4255-85df-0c31e9edadaf","Type":"ContainerStarted","Data":"f652aa2f2dca99ab529b72e79ff83953efd451066d7822ca9dc5e56882371ed0"} Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.711260 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r" event={"ID":"ee154dbc-917d-46f5-bd9e-1a3c11a19a41","Type":"ContainerStarted","Data":"73ce85e5c04770ce161e73d3f12c1e2844ef935a90ee5fefa4b1e3d8bba08d5b"} Feb 18 13:59:37 crc kubenswrapper[4817]: 
I0218 13:59:37.725315 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" event={"ID":"456d086a-23a5-42fa-b637-a090f649ffe5","Type":"ContainerStarted","Data":"f778e7f7741259cd8a60eff7feab5d730037e45a4ee48ae9424529b9614a6e43"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.732045 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr" event={"ID":"9d682539-5f81-4b10-a11a-50d3a9ef0c1c","Type":"ContainerStarted","Data":"d97379657f7ce27699027ca979037c5946e80e5356811213e0b23dd7cc6d21ca"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.736123 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" event={"ID":"ed392029-9b45-4464-9422-10e4ec72db07","Type":"ContainerStarted","Data":"ace56c5780ee325282475982678a34523f5b469c95664b5ee725aa291e1961a4"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.741469 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" event={"ID":"a0286241-5427-4d37-9c39-717227ba63d8","Type":"ContainerStarted","Data":"f627b00bb290920041c07dd35159f4aa8dc14f3d94144de2399d18f9c86f378f"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.741531 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" event={"ID":"a0286241-5427-4d37-9c39-717227ba63d8","Type":"ContainerStarted","Data":"8cef2c6f121846d3e14760a611335d4e600a5a1f484e6c39b03b81842f4d9ff2"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.742962 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743046 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743078 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743167 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743193 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743217 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-images\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743236 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd99710-b175-4115-8944-1fac544145c5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743268 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-registry-tls\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743290 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743313 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-trusted-ca\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743335 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743369 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tcgb\" (UniqueName: \"kubernetes.io/projected/e983df69-9be7-498d-a703-90a309bd98e8-kube-api-access-4tcgb\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743397 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743421 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ae1a25b-712e-42ec-9b7b-4925ba7f97af-signing-key\") pod \"service-ca-9c57cc56f-vs5x4\" (UID: \"1ae1a25b-712e-42ec-9b7b-4925ba7f97af\") " pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743457 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743480 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743502 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-policies\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743547 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4faf1743-e825-477d-b191-830513a39317-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743570 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743603 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743622 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4faf1743-e825-477d-b191-830513a39317-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743639 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e983df69-9be7-498d-a703-90a309bd98e8-apiservice-cert\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743663 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743683 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-encryption-config\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.743702 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-bound-sa-token\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.744558 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.745467 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.747177 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wwvbf" event={"ID":"5345e1d1-a74f-4d8f-8b86-2bb389a525a2","Type":"ContainerStarted","Data":"e2de8bef69dfe3c0d7a8a40533054f6a2c6ccef38b763fee5fc5d766dcecfd57"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.747195 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.748229 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.748279 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-config\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.749945 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-audit-policies\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.752897 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5" event={"ID":"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8","Type":"ContainerStarted","Data":"c88abd0cb9efaaff7109faa10f25022727d9e306333fb79fef0676d5c5e97c00"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.753054 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.753173 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.753203 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.753276 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-registry-certificates\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.753378 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e983df69-9be7-498d-a703-90a309bd98e8-webhook-cert\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.753410 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.753433 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.753471 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-serving-cert\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.754727 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-config\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.755213 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.756016 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.756227 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.757636 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.758443 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.758626 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.758796 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq49p\" (UniqueName: \"kubernetes.io/projected/1ae1a25b-712e-42ec-9b7b-4925ba7f97af-kube-api-access-gq49p\") pod \"service-ca-9c57cc56f-vs5x4\" (UID: \"1ae1a25b-712e-42ec-9b7b-4925ba7f97af\") " pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.759114 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.759237 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ae1a25b-712e-42ec-9b7b-4925ba7f97af-signing-cabundle\") pod \"service-ca-9c57cc56f-vs5x4\" (UID: \"1ae1a25b-712e-42ec-9b7b-4925ba7f97af\") " pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.759558 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9ztb\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-kube-api-access-j9ztb\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:37 crc kubenswrapper[4817]: E0218 13:59:37.759576 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:38.259557715 +0000 UTC m=+40.835093698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.759607 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.760034 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e983df69-9be7-498d-a703-90a309bd98e8-tmpfs\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.760056 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" event={"ID":"3db6bfaa-89b3-49e5-9c33-2959670f96f1","Type":"ContainerStarted","Data":"d35018df4c586092d14df607914e7ab3cad3e05cb20a35e8796f43a5c7880d07"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.760135 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-client\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.761390 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6" event={"ID":"25f80446-5f8d-476a-91e7-c42b9572854b","Type":"ContainerStarted","Data":"d8bca4dfc5c93f238ae10372ace4f798c254a28744092f24b3e3c4f719b13a5b"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.761595 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.761939 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.762750 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.762768 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.764478 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-etcd-client\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.765583 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.766459 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-encryption-config\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.766504 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.766759 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert\") pod \"route-controller-manager-6576b87f9c-hrdpr\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.767114 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.768383 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3bd99710-b175-4115-8944-1fac544145c5-images\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.769827 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca679b0d-4e7e-4526-af6e-e3b0cb400fd0-serving-cert\") pod \"apiserver-7bbb656c7d-pq5tr\" (UID: \"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.771338 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.773840 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bclz6\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.782446 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd99710-b175-4115-8944-1fac544145c5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x8x2r\" (UID: \"3bd99710-b175-4115-8944-1fac544145c5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.784467 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mx7fv"]
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.803841 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.809639 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc" event={"ID":"6b5551de-6f87-439b-875c-b66e902f2f25","Type":"ContainerStarted","Data":"0dc1a0e9ef69c417a550c3528126f0d25d1eb9c61c0c5d0a70619d3b57fd2632"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.809679 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79"]
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.816256 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb" event={"ID":"9f128b3f-8527-4b4b-86d5-b456fe89c804","Type":"ContainerStarted","Data":"e30aba4a4f5a914e00f764e1307a90f5b16bdd0ae7a701fc099a61a3c5acf150"}
Feb 18 13:59:37 crc kubenswrapper[4817]: W0218 13:59:37.834501 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacfbc5ef_21ea_41bb_ba10_c5b5e79dd593.slice/crio-0a66b332be5119f73667756ee922ad8f6c38c1286b275b683de2b492e1304c6b WatchSource:0}: Error finding container 0a66b332be5119f73667756ee922ad8f6c38c1286b275b683de2b492e1304c6b: Status 404 returned error can't find the container with id 0a66b332be5119f73667756ee922ad8f6c38c1286b275b683de2b492e1304c6b
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.837026 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh" event={"ID":"57387b14-66f5-4d48-9591-784eb4216a13","Type":"ContainerStarted","Data":"9e2e61a7f1908b9c18ae3493f5b403bbdb7230b41c2442c94ca57abe39033e85"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.851807 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" event={"ID":"f8d657cd-dbc9-45d6-8ac4-9c22d9709980","Type":"ContainerStarted","Data":"d2ae3846da8387764fdf002cebeb62bb5425863eebc33030e83142bdad5d3345"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.851850 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" event={"ID":"f8d657cd-dbc9-45d6-8ac4-9c22d9709980","Type":"ContainerStarted","Data":"6899aec716a892aa8bf93b84ff3fecfa0d6235679f7265510d26e3583e54524f"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.852617 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.858165 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" event={"ID":"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c","Type":"ContainerStarted","Data":"00b3749a02417ed70d3c0dfcfdc1adf7571b65fcd4e2da7631c7b6465f442797"}
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.861249 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.861555 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4faf1743-e825-477d-b191-830513a39317-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.861585 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e983df69-9be7-498d-a703-90a309bd98e8-apiservice-cert\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.861668 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ffb4e423-81f2-4b61-9b22-c4f6d8504861-node-bootstrap-token\") pod \"machine-config-server-87jjf\" (UID: \"ffb4e423-81f2-4b61-9b22-c4f6d8504861\") " pod="openshift-machine-config-operator/machine-config-server-87jjf"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.861696 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-plugins-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.861723 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-bound-sa-token\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.861794 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ffb4e423-81f2-4b61-9b22-c4f6d8504861-certs\") pod \"machine-config-server-87jjf\" (UID: \"ffb4e423-81f2-4b61-9b22-c4f6d8504861\") " pod="openshift-machine-config-operator/machine-config-server-87jjf"
Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.861810 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09b37c68-ecb2-4b3c-8cac-4d457be545ec-cert\") pod \"ingress-canary-m47g9\" (UID: \"09b37c68-ecb2-4b3c-8cac-4d457be545ec\") " pod="openshift-ingress-canary/ingress-canary-m47g9"
Feb 18 13:59:37 crc
kubenswrapper[4817]: I0218 13:59:37.861955 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-csi-data-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.862000 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-registry-certificates\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.862022 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e983df69-9be7-498d-a703-90a309bd98e8-webhook-cert\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.862089 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-ready\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.862116 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq49p\" (UniqueName: \"kubernetes.io/projected/1ae1a25b-712e-42ec-9b7b-4925ba7f97af-kube-api-access-gq49p\") pod \"service-ca-9c57cc56f-vs5x4\" (UID: \"1ae1a25b-712e-42ec-9b7b-4925ba7f97af\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.862175 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ae1a25b-712e-42ec-9b7b-4925ba7f97af-signing-cabundle\") pod \"service-ca-9c57cc56f-vs5x4\" (UID: \"1ae1a25b-712e-42ec-9b7b-4925ba7f97af\") " pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" Feb 18 13:59:37 crc kubenswrapper[4817]: E0218 13:59:37.862607 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:38.36256068 +0000 UTC m=+40.938096653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.863927 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" event={"ID":"18a3347a-5dd7-4047-8c43-9c073c9321e6","Type":"ContainerStarted","Data":"b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf"} Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.863971 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" event={"ID":"18a3347a-5dd7-4047-8c43-9c073c9321e6","Type":"ContainerStarted","Data":"0ab67841e6ac80e55979c94e709143b08600c0a3f7865376e902b3ba086a052e"} Feb 18 13:59:37 crc 
kubenswrapper[4817]: I0218 13:59:37.865218 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.867997 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-registry-certificates\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.868922 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ae1a25b-712e-42ec-9b7b-4925ba7f97af-signing-cabundle\") pod \"service-ca-9c57cc56f-vs5x4\" (UID: \"1ae1a25b-712e-42ec-9b7b-4925ba7f97af\") " pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.869307 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4faf1743-e825-477d-b191-830513a39317-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.870040 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9ztb\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-kube-api-access-j9ztb\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.870136 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-42vsc\" (UniqueName: \"kubernetes.io/projected/e3015431-de34-487b-b9d9-473c6da0578b-kube-api-access-42vsc\") pod \"dns-default-pqtc6\" (UID: \"e3015431-de34-487b-b9d9-473c6da0578b\") " pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.870315 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e983df69-9be7-498d-a703-90a309bd98e8-tmpfs\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.870838 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e983df69-9be7-498d-a703-90a309bd98e8-tmpfs\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.870880 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v67qq\" (UniqueName: \"kubernetes.io/projected/614109c1-b0a9-4ff3-b3d4-04325985c7df-kube-api-access-v67qq\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.871369 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3015431-de34-487b-b9d9-473c6da0578b-config-volume\") pod \"dns-default-pqtc6\" (UID: \"e3015431-de34-487b-b9d9-473c6da0578b\") " pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.871728 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-xv6xh\" (UniqueName: \"kubernetes.io/projected/09b37c68-ecb2-4b3c-8cac-4d457be545ec-kube-api-access-xv6xh\") pod \"ingress-canary-m47g9\" (UID: \"09b37c68-ecb2-4b3c-8cac-4d457be545ec\") " pod="openshift-ingress-canary/ingress-canary-m47g9" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.872652 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-mountpoint-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.874769 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e983df69-9be7-498d-a703-90a309bd98e8-apiservice-cert\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.892203 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsknh\" (UniqueName: \"kubernetes.io/projected/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-kube-api-access-rsknh\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.892739 4817 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j4llw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/healthz\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.892811 4817 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" podUID="18a3347a-5dd7-4047-8c43-9c073c9321e6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.15:8080/healthz\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.894445 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxmdb\" (UniqueName: \"kubernetes.io/projected/43f553ef-0150-4383-8c39-5db2cbcab63d-kube-api-access-kxmdb\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.894515 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.895467 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e983df69-9be7-498d-a703-90a309bd98e8-webhook-cert\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.895818 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-registration-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 
13:59:37.896781 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-registry-tls\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.897546 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-trusted-ca\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.897677 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tcgb\" (UniqueName: \"kubernetes.io/projected/e983df69-9be7-498d-a703-90a309bd98e8-kube-api-access-4tcgb\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.904609 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ae1a25b-712e-42ec-9b7b-4925ba7f97af-signing-key\") pod \"service-ca-9c57cc56f-vs5x4\" (UID: \"1ae1a25b-712e-42ec-9b7b-4925ba7f97af\") " pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.904939 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-socket-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:37 
crc kubenswrapper[4817]: I0218 13:59:37.904985 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-bound-sa-token\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.905024 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.905047 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4b67\" (UniqueName: \"kubernetes.io/projected/ffb4e423-81f2-4b61-9b22-c4f6d8504861-kube-api-access-t4b67\") pod \"machine-config-server-87jjf\" (UID: \"ffb4e423-81f2-4b61-9b22-c4f6d8504861\") " pod="openshift-machine-config-operator/machine-config-server-87jjf" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.908213 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pj24h"] Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.909030 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxmdb\" (UniqueName: \"kubernetes.io/projected/43f553ef-0150-4383-8c39-5db2cbcab63d-kube-api-access-kxmdb\") pod \"controller-manager-879f6c89f-l7vpp\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.909602 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4faf1743-e825-477d-b191-830513a39317-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.909640 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3015431-de34-487b-b9d9-473c6da0578b-metrics-tls\") pod \"dns-default-pqtc6\" (UID: \"e3015431-de34-487b-b9d9-473c6da0578b\") " pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.911486 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.911796 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq49p\" (UniqueName: \"kubernetes.io/projected/1ae1a25b-712e-42ec-9b7b-4925ba7f97af-kube-api-access-gq49p\") pod \"service-ca-9c57cc56f-vs5x4\" (UID: \"1ae1a25b-712e-42ec-9b7b-4925ba7f97af\") " pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.914665 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4faf1743-e825-477d-b191-830513a39317-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.915821 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-registry-tls\") pod \"image-registry-697d97f7c8-mddfd\" (UID: 
\"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.916220 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-trusted-ca\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.920837 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ae1a25b-712e-42ec-9b7b-4925ba7f97af-signing-key\") pod \"service-ca-9c57cc56f-vs5x4\" (UID: \"1ae1a25b-712e-42ec-9b7b-4925ba7f97af\") " pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.928404 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.937942 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.948489 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.956824 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9ztb\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-kube-api-access-j9ztb\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:37 crc kubenswrapper[4817]: W0218 13:59:37.961329 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f1a30b_47d4_452e_9017_dcc9cf78795f.slice/crio-5eef7139bb99b611967904d61985f279a9e32bb5f542db826e7ed1c12e4a6e8e WatchSource:0}: Error finding container 5eef7139bb99b611967904d61985f279a9e32bb5f542db826e7ed1c12e4a6e8e: Status 404 returned error can't find the container with id 5eef7139bb99b611967904d61985f279a9e32bb5f542db826e7ed1c12e4a6e8e Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.978140 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" Feb 18 13:59:37 crc kubenswrapper[4817]: I0218 13:59:37.984050 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tcgb\" (UniqueName: \"kubernetes.io/projected/e983df69-9be7-498d-a703-90a309bd98e8-kube-api-access-4tcgb\") pod \"packageserver-d55dfcdfc-phf44\" (UID: \"e983df69-9be7-498d-a703-90a309bd98e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.019029 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644"] Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029150 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsknh\" (UniqueName: \"kubernetes.io/projected/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-kube-api-access-rsknh\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029235 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029259 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-registration-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029330 
4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-socket-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029350 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4b67\" (UniqueName: \"kubernetes.io/projected/ffb4e423-81f2-4b61-9b22-c4f6d8504861-kube-api-access-t4b67\") pod \"machine-config-server-87jjf\" (UID: \"ffb4e423-81f2-4b61-9b22-c4f6d8504861\") " pod="openshift-machine-config-operator/machine-config-server-87jjf" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029469 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029496 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3015431-de34-487b-b9d9-473c6da0578b-metrics-tls\") pod \"dns-default-pqtc6\" (UID: \"e3015431-de34-487b-b9d9-473c6da0578b\") " pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029630 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-plugins-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029656 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ffb4e423-81f2-4b61-9b22-c4f6d8504861-node-bootstrap-token\") pod \"machine-config-server-87jjf\" (UID: \"ffb4e423-81f2-4b61-9b22-c4f6d8504861\") " pod="openshift-machine-config-operator/machine-config-server-87jjf" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029709 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09b37c68-ecb2-4b3c-8cac-4d457be545ec-cert\") pod \"ingress-canary-m47g9\" (UID: \"09b37c68-ecb2-4b3c-8cac-4d457be545ec\") " pod="openshift-ingress-canary/ingress-canary-m47g9" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029731 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ffb4e423-81f2-4b61-9b22-c4f6d8504861-certs\") pod \"machine-config-server-87jjf\" (UID: \"ffb4e423-81f2-4b61-9b22-c4f6d8504861\") " pod="openshift-machine-config-operator/machine-config-server-87jjf" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.029812 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-csi-data-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.030040 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-ready\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.030094 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.032432 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42vsc\" (UniqueName: \"kubernetes.io/projected/e3015431-de34-487b-b9d9-473c6da0578b-kube-api-access-42vsc\") pod \"dns-default-pqtc6\" (UID: \"e3015431-de34-487b-b9d9-473c6da0578b\") " pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.032520 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v67qq\" (UniqueName: \"kubernetes.io/projected/614109c1-b0a9-4ff3-b3d4-04325985c7df-kube-api-access-v67qq\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.032596 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3015431-de34-487b-b9d9-473c6da0578b-config-volume\") pod \"dns-default-pqtc6\" (UID: \"e3015431-de34-487b-b9d9-473c6da0578b\") " pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.032619 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv6xh\" (UniqueName: \"kubernetes.io/projected/09b37c68-ecb2-4b3c-8cac-4d457be545ec-kube-api-access-xv6xh\") pod \"ingress-canary-m47g9\" (UID: \"09b37c68-ecb2-4b3c-8cac-4d457be545ec\") " pod="openshift-ingress-canary/ingress-canary-m47g9" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.032698 4817 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-mountpoint-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.038963 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-csi-data-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.040441 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-ready\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.042697 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:38.5426515 +0000 UTC m=+41.118187483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.043894 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3015431-de34-487b-b9d9-473c6da0578b-config-volume\") pod \"dns-default-pqtc6\" (UID: \"e3015431-de34-487b-b9d9-473c6da0578b\") " pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.044100 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-registration-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.044123 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-mountpoint-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.044246 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:38 
crc kubenswrapper[4817]: I0218 13:59:38.044311 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-plugins-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.045004 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.048997 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/614109c1-b0a9-4ff3-b3d4-04325985c7df-socket-dir\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.051691 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6"] Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.065061 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09b37c68-ecb2-4b3c-8cac-4d457be545ec-cert\") pod \"ingress-canary-m47g9\" (UID: \"09b37c68-ecb2-4b3c-8cac-4d457be545ec\") " pod="openshift-ingress-canary/ingress-canary-m47g9" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.074071 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsknh\" (UniqueName: \"kubernetes.io/projected/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-kube-api-access-rsknh\") pod \"cni-sysctl-allowlist-ds-pb2jx\" (UID: 
\"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.076142 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.082162 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz"] Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.090036 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ffb4e423-81f2-4b61-9b22-c4f6d8504861-certs\") pod \"machine-config-server-87jjf\" (UID: \"ffb4e423-81f2-4b61-9b22-c4f6d8504861\") " pod="openshift-machine-config-operator/machine-config-server-87jjf" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.092617 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ffb4e423-81f2-4b61-9b22-c4f6d8504861-node-bootstrap-token\") pod \"machine-config-server-87jjf\" (UID: \"ffb4e423-81f2-4b61-9b22-c4f6d8504861\") " pod="openshift-machine-config-operator/machine-config-server-87jjf" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.093023 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3015431-de34-487b-b9d9-473c6da0578b-metrics-tls\") pod \"dns-default-pqtc6\" (UID: \"e3015431-de34-487b-b9d9-473c6da0578b\") " pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.096146 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-975fj"] Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.118128 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-42vsc\" (UniqueName: \"kubernetes.io/projected/e3015431-de34-487b-b9d9-473c6da0578b-kube-api-access-42vsc\") pod \"dns-default-pqtc6\" (UID: \"e3015431-de34-487b-b9d9-473c6da0578b\") " pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.137833 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v67qq\" (UniqueName: \"kubernetes.io/projected/614109c1-b0a9-4ff3-b3d4-04325985c7df-kube-api-access-v67qq\") pod \"csi-hostpathplugin-x4qx6\" (UID: \"614109c1-b0a9-4ff3-b3d4-04325985c7df\") " pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.141284 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.141681 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:38.641660711 +0000 UTC m=+41.217196694 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.150912 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4b67\" (UniqueName: \"kubernetes.io/projected/ffb4e423-81f2-4b61-9b22-c4f6d8504861-kube-api-access-t4b67\") pod \"machine-config-server-87jjf\" (UID: \"ffb4e423-81f2-4b61-9b22-c4f6d8504861\") " pod="openshift-machine-config-operator/machine-config-server-87jjf" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.152613 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv6xh\" (UniqueName: \"kubernetes.io/projected/09b37c68-ecb2-4b3c-8cac-4d457be545ec-kube-api-access-xv6xh\") pod \"ingress-canary-m47g9\" (UID: \"09b37c68-ecb2-4b3c-8cac-4d457be545ec\") " pod="openshift-ingress-canary/ingress-canary-m47g9" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.192282 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.202362 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.211207 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz"] Feb 18 13:59:38 crc kubenswrapper[4817]: W0218 13:59:38.232064 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode61f548d_eef7_4f2a_9854_c5bdb6b2b815.slice/crio-a7421feaf249a7a6e25d8dede2ce5a4c29ac011b10dadd7373b6aeaec5195dae WatchSource:0}: Error finding container a7421feaf249a7a6e25d8dede2ce5a4c29ac011b10dadd7373b6aeaec5195dae: Status 404 returned error can't find the container with id a7421feaf249a7a6e25d8dede2ce5a4c29ac011b10dadd7373b6aeaec5195dae Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.244569 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.245140 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:38.745124738 +0000 UTC m=+41.320660721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.328044 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.341153 4817 csr.go:261] certificate signing request csr-p5pc2 is approved, waiting to be issued Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.345601 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.345610 4817 csr.go:257] certificate signing request csr-p5pc2 is issued Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.345721 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:38.845697781 +0000 UTC m=+41.421233764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.347431 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.347849 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:38.847832556 +0000 UTC m=+41.423368539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.352558 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.367474 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-87jjf" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.382065 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m47g9" Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.449338 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.454676 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:38.950334118 +0000 UTC m=+41.525870101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.554033 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.554830 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:39.0548122 +0000 UTC m=+41.630348183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.665724 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.673305 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:39.173281185 +0000 UTC m=+41.748817168 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.764290 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bclz6"] Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.772951 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.773329 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:39.273316584 +0000 UTC m=+41.848852557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.866877 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-x8x2r"] Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.874553 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.875105 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:39.375084887 +0000 UTC m=+41.950620870 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.961020 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l7vpp"] Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.972575 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5" event={"ID":"1fd1e3a6-ff0d-4bf0-be88-ce9117cfedf8","Type":"ContainerStarted","Data":"347608b4c781a7a44fa1d54e846739e6d95bb7bd04376092aabf7bd652b97268"} Feb 18 13:59:38 crc kubenswrapper[4817]: I0218 13:59:38.977699 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:38 crc kubenswrapper[4817]: E0218 13:59:38.978227 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:39.478211975 +0000 UTC m=+42.053747958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:38.997676 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv" event={"ID":"acfbc5ef-21ea-41bb-ba10-c5b5e79dd593","Type":"ContainerStarted","Data":"9c06e9c6cd7c2dab3eae87c33366400d98516ad1790db4f03423198891baa7d6"} Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.000094 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv" event={"ID":"acfbc5ef-21ea-41bb-ba10-c5b5e79dd593","Type":"ContainerStarted","Data":"0a66b332be5119f73667756ee922ad8f6c38c1286b275b683de2b492e1304c6b"} Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.050435 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc" event={"ID":"6b5551de-6f87-439b-875c-b66e902f2f25","Type":"ContainerStarted","Data":"9010ac41e3bba5aac8149d88be6cb713a7197226c6f92963e7ef6795b4af6e12"} Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.086624 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.086886 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:39.586845005 +0000 UTC m=+42.162380988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.087612 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.088106 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:39.588086658 +0000 UTC m=+42.163622631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.123870 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" event={"ID":"3f85543c-dccf-4a3e-be40-7305a2e49d1d","Type":"ContainerStarted","Data":"23b7bdfebe45c1cdd3b38d114864281e4380b7ed9d829a4ee15d265d3a4509f8"} Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.126712 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"] Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.128678 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79" event={"ID":"0f3e1cca-80c4-4d17-bb9f-9556fed78aac","Type":"ContainerStarted","Data":"c1eed4bf4d097ecfc641a739d8a72c62b2d0666f1ff54d6e76417514635101c7"} Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.130885 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wwvbf" event={"ID":"5345e1d1-a74f-4d8f-8b86-2bb389a525a2","Type":"ContainerStarted","Data":"4d7636139350bd5480f1df8e256fde2f53dfb5231eb8f3d22177c9c86bb3f5da"} Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.133993 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-wwvbf" Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.134065 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-wwvbf 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.134103 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wwvbf" podUID="5345e1d1-a74f-4d8f-8b86-2bb389a525a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.167404 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" event={"ID":"6df56857-33ce-442e-a600-188003fc196e","Type":"ContainerStarted","Data":"fd9df5542ee42cf334cbb68b2068c7beec676c3fb06ccd7c2641bd302aa1c9d2"} Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.185099 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pj24h" event={"ID":"29f1a30b-47d4-452e-9017-dcc9cf78795f","Type":"ContainerStarted","Data":"5eef7139bb99b611967904d61985f279a9e32bb5f542db826e7ed1c12e4a6e8e"} Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.188849 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.190734 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-18 13:59:39.690708723 +0000 UTC m=+42.266244706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.269657 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w7l8m" podStartSLOduration=20.269635215 podStartE2EDuration="20.269635215s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:39.262378787 +0000 UTC m=+41.837914770" watchObservedRunningTime="2026-02-18 13:59:39.269635215 +0000 UTC m=+41.845171198"
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.300222 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz" event={"ID":"98bc0caa-ee8a-406e-952d-b400d8c72116","Type":"ContainerStarted","Data":"109a5bfc36280c70fc9b4ed6fa84c26d9bd029f4364fa9dd9034fc6ab08cf5db"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.300726 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz" event={"ID":"98bc0caa-ee8a-406e-952d-b400d8c72116","Type":"ContainerStarted","Data":"7b30b6a63a5640ed833cfc6d53306ceb63a19b1c08ae4fc32f354caa54fa4f0d"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.302186 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.305505 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:39.805487683 +0000 UTC m=+42.381023666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.341414 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" podStartSLOduration=19.341386591 podStartE2EDuration="19.341386591s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:39.298398529 +0000 UTC m=+41.873934512" watchObservedRunningTime="2026-02-18 13:59:39.341386591 +0000 UTC m=+41.916922574"
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.370900 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-18 13:54:38 +0000 UTC, rotation deadline is 2026-12-05 23:39:41.162997969 +0000 UTC
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.370957 4817 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6969h40m1.792043353s for next certificate rotation
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.378545 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44"]
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.380013 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" event={"ID":"f8d657cd-dbc9-45d6-8ac4-9c22d9709980","Type":"ContainerStarted","Data":"9e98870048c5088ec3f6a30b9499451ec6e90a61292feebc03770e2340e59091"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.405807 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.407138 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:39.907075991 +0000 UTC m=+42.482611974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.416375 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" podStartSLOduration=19.416341181 podStartE2EDuration="19.416341181s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:39.413175279 +0000 UTC m=+41.988711272" watchObservedRunningTime="2026-02-18 13:59:39.416341181 +0000 UTC m=+41.991877164"
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.416588 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6" event={"ID":"25f80446-5f8d-476a-91e7-c42b9572854b","Type":"ContainerStarted","Data":"6dcb53a767e02121ec533aa9bdebdba079022956c2b61f9edfdf568b942b731c"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.452123 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" event={"ID":"3db6bfaa-89b3-49e5-9c33-2959670f96f1","Type":"ContainerStarted","Data":"c990c6a5cc65b209e3fbe6c5787eb7cb8646a6ee004f1eb0770330d2ebe66fa0"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.509778 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.511410 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.01139603 +0000 UTC m=+42.586932013 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.520103 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv" event={"ID":"78ff2eb3-3f40-4529-b52e-62316f24fd15","Type":"ContainerStarted","Data":"401806139b756380054a4310dac29dd520766a664336a5e473d24b8193d43062"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.520157 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv" event={"ID":"78ff2eb3-3f40-4529-b52e-62316f24fd15","Type":"ContainerStarted","Data":"1e326dc804f0fedac86e3d7954fcd36b2504acfe5cdb09158a1e8b07df18c567"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.539364 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rpmhp" event={"ID":"cc05dc28-13af-4d05-835a-e3ecc993b1ab","Type":"ContainerStarted","Data":"a9b341866976b62d2bf677c5b84e087b3f1bc3f2a7424bc13ea80d57b4854de5"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.540463 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.550327 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr"]
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.564107 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-rpmhp"
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.570747 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" event={"ID":"456d086a-23a5-42fa-b637-a090f649ffe5","Type":"ContainerStarted","Data":"bbfd10350e8a615da0b0ee3675bc399f905718f3020df1a1144498d914b6adc9"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.570824 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" event={"ID":"456d086a-23a5-42fa-b637-a090f649ffe5","Type":"ContainerStarted","Data":"615bc2090160a2e0a2fa6c744afbdd7e8c0fe3952b6f04fa38daa56c1a8ca05b"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.608812 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" event={"ID":"5b9edf7b-9549-4eda-a45e-27b94c137b4a","Type":"ContainerStarted","Data":"4ec8e9a439fd5eff9632aa8a1699f065c4988453d36b2848d006dc5d8489ac40"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.612446 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.612905 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.112884966 +0000 UTC m=+42.688420949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.664197 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" event={"ID":"ed392029-9b45-4464-9422-10e4ec72db07","Type":"ContainerStarted","Data":"1a6e381623c2bb1e97af53809aed801a6db8cae6521df7e20803bd40ab6999f5"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.664268 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" event={"ID":"ed392029-9b45-4464-9422-10e4ec72db07","Type":"ContainerStarted","Data":"18f66ccf61fcc822615a7daac503079b2d643633c4a33f27937cc59acc3ea1d2"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.696105 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz" event={"ID":"9f66536d-c481-41b3-b5e5-8259651a95d9","Type":"ContainerStarted","Data":"de308d929e6fd31bdef2b0ec8db75ede8b20e3a3a43cb3b7f8de5a613a9bf85d"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.717074 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.717424 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.21741129 +0000 UTC m=+42.792947273 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.779947 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46tjc" podStartSLOduration=19.779923488 podStartE2EDuration="19.779923488s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:39.779145068 +0000 UTC m=+42.354681051" watchObservedRunningTime="2026-02-18 13:59:39.779923488 +0000 UTC m=+42.355459471"
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.788403 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" event={"ID":"e61f548d-eef7-4f2a-9854-c5bdb6b2b815","Type":"ContainerStarted","Data":"a7421feaf249a7a6e25d8dede2ce5a4c29ac011b10dadd7373b6aeaec5195dae"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.801258 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" event={"ID":"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982","Type":"ContainerStarted","Data":"4e261bdd0ab08dd24f717eaf074c2768ee0497563b058db330449c524941fcbe"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.821942 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.822205 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.322174171 +0000 UTC m=+42.897710164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.824154 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.824919 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.324911602 +0000 UTC m=+42.900447575 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.855525 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" event={"ID":"4ef73bb6-9567-4fe4-81ae-ae50b8c70455","Type":"ContainerStarted","Data":"8d68fb4554e0b5196137b0b94f1929ddbfadf2490674009253f8d268846907c7"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.870193 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vs5x4"]
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.872273 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh" event={"ID":"57387b14-66f5-4d48-9591-784eb4216a13","Type":"ContainerStarted","Data":"c552f662636834c73c061a296b6ac933ece5caae12b1f6a31ed1cc023d272d64"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.899331 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6w9rz" event={"ID":"0a42d5a9-1383-4a55-9c81-bd40eb5ba86f","Type":"ContainerStarted","Data":"c2ee898046d219993f4b6a4ea638364da4ba9c083ebb3c18ee3b213417aced54"}
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.928677 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:39 crc kubenswrapper[4817]: E0218 13:59:39.929118 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.429098177 +0000 UTC m=+43.004634160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:39 crc kubenswrapper[4817]: I0218 13:59:39.998261 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r" event={"ID":"ee154dbc-917d-46f5-bd9e-1a3c11a19a41","Type":"ContainerStarted","Data":"66804c1837263a05817576c936cd5a81c16b6f7085bd61da10b9d7600c7151ea"}
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.031548 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.032502 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.532484922 +0000 UTC m=+43.108020905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.105280 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx" event={"ID":"be1e2228-c1df-48a2-83a3-ef74747c69c9","Type":"ContainerStarted","Data":"1029240aa58320ab42e1b2bbe122f87fad7597b3d488433520febb06c76f213a"}
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.123720 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz" podStartSLOduration=20.123693382 podStartE2EDuration="20.123693382s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:39.99296065 +0000 UTC m=+42.568496663" watchObservedRunningTime="2026-02-18 13:59:40.123693382 +0000 UTC m=+42.699229365"
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.124557 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x4qx6"]
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.138412 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.138898 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.638858555 +0000 UTC m=+43.214394538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.155519 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.655499795 +0000 UTC m=+43.231035778 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.156192 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.166071 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr" event={"ID":"9d682539-5f81-4b10-a11a-50d3a9ef0c1c","Type":"ContainerStarted","Data":"a1e90089f77c6a2e9b84823c579ada33aa1758999942d922f6534d202e6b758f"}
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.166242 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr" event={"ID":"9d682539-5f81-4b10-a11a-50d3a9ef0c1c","Type":"ContainerStarted","Data":"f69eb212023b0a37af39af11515c888f301ac9e3b4033431802c739a8bd46c6e"}
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.170728 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m47g9"]
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.257643 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.259218 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.759186318 +0000 UTC m=+43.334722301 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.268845 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wwvbf" podStartSLOduration=20.268817097 podStartE2EDuration="20.268817097s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.242371413 +0000 UTC m=+42.817907396" watchObservedRunningTime="2026-02-18 13:59:40.268817097 +0000 UTC m=+42.844353080"
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.283448 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v2snv" event={"ID":"149bcfc3-9623-403e-8c4c-1019bd5f0c16","Type":"ContainerStarted","Data":"1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2"}
Feb 18 13:59:40 crc kubenswrapper[4817]: W0218 13:59:40.284313 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b37c68_ecb2_4b3c_8cac_4d457be545ec.slice/crio-2b2e019e932cb307a70d7396ce2458771703f0090535c0581ef6bdc5ab568199 WatchSource:0}: Error finding container 2b2e019e932cb307a70d7396ce2458771703f0090535c0581ef6bdc5ab568199: Status 404 returned error can't find the container with id 2b2e019e932cb307a70d7396ce2458771703f0090535c0581ef6bdc5ab568199
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.311938 4817 generic.go:334] "Generic (PLEG): container finished" podID="9f128b3f-8527-4b4b-86d5-b456fe89c804" containerID="1164cb814f318dbdea0105c8a6edf5f56d1e2fde972ecfb98213ca16173d8832" exitCode=0
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.312093 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb" event={"ID":"9f128b3f-8527-4b4b-86d5-b456fe89c804","Type":"ContainerDied","Data":"1164cb814f318dbdea0105c8a6edf5f56d1e2fde972ecfb98213ca16173d8832"}
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.334710 4817 generic.go:334] "Generic (PLEG): container finished" podID="67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c" containerID="cb39b5f473e216fa983435f7d8aacce4226f27cc92c329bd5615d461a6e65ec0" exitCode=0
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.334863 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" event={"ID":"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c","Type":"ContainerDied","Data":"cb39b5f473e216fa983435f7d8aacce4226f27cc92c329bd5615d461a6e65ec0"}
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.341783 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pqtc6"]
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.360668 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.364639 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" event={"ID":"c2e62f6f-bd21-4255-85df-0c31e9edadaf","Type":"ContainerStarted","Data":"a854ffcf6c43daa224f1dd090b5028c89a79064af071721cb2bad245abe1fe8f"}
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.364692 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" event={"ID":"c2e62f6f-bd21-4255-85df-0c31e9edadaf","Type":"ContainerStarted","Data":"eb3c0fc8addf2b0ed91c1414141991086b5d442fa8f38c50c9d6e05162223dd6"}
Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.368950 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.868909017 +0000 UTC m=+43.444445000 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.389449 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw"
Feb 18 13:59:40 crc kubenswrapper[4817]: W0218 13:59:40.441029 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3015431_de34_487b_b9d9_473c6da0578b.slice/crio-8864956dbc95284d64d05137c5739eaf067f6905ebf2b8f512e1101798868fdd WatchSource:0}: Error finding container 8864956dbc95284d64d05137c5739eaf067f6905ebf2b8f512e1101798868fdd: Status 404 returned error can't find the container with id 8864956dbc95284d64d05137c5739eaf067f6905ebf2b8f512e1101798868fdd
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.470670 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.471173 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.971133012 +0000 UTC m=+43.546669005 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.471715 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.474148 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:40.974131109 +0000 UTC m=+43.549667092 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.477146 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6w9rz" podStartSLOduration=20.477117867 podStartE2EDuration="20.477117867s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.467704863 +0000 UTC m=+43.043240856" watchObservedRunningTime="2026-02-18 13:59:40.477117867 +0000 UTC m=+43.052653850"
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.477722 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z77sh" podStartSLOduration=20.477716962 podStartE2EDuration="20.477716962s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.428527609 +0000 UTC m=+43.004063592" watchObservedRunningTime="2026-02-18 13:59:40.477716962 +0000 UTC m=+43.053252945"
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.512214 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-bwc6d" podStartSLOduration=20.512184834 podStartE2EDuration="20.512184834s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.48767481 +0000 UTC m=+43.063210793" watchObservedRunningTime="2026-02-18 13:59:40.512184834 +0000 UTC m=+43.087720817"
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.516507 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tg2g5" podStartSLOduration=20.516488805 podStartE2EDuration="20.516488805s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.514948255 +0000 UTC m=+43.090484238" watchObservedRunningTime="2026-02-18 13:59:40.516488805 +0000 UTC m=+43.092024788"
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.554637 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" podStartSLOduration=21.554614892 podStartE2EDuration="21.554614892s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.547657742 +0000 UTC m=+43.123193725" watchObservedRunningTime="2026-02-18 13:59:40.554614892 +0000 UTC m=+43.130150865"
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.573234 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.573453 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.073426038 +0000 UTC m=+43.648962021 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.573505 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.574288 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.07427982 +0000 UTC m=+43.649815793 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.592726 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mxgkh" podStartSLOduration=20.592697207 podStartE2EDuration="20.592697207s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.585527101 +0000 UTC m=+43.161063084" watchObservedRunningTime="2026-02-18 13:59:40.592697207 +0000 UTC m=+43.168233180" Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.648620 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-b75zt" podStartSLOduration=20.648594593 podStartE2EDuration="20.648594593s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.648392748 +0000 UTC m=+43.223928731" watchObservedRunningTime="2026-02-18 13:59:40.648594593 +0000 UTC m=+43.224130576" Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.678496 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.678710 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.178675072 +0000 UTC m=+43.754211055 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.678830 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.679427 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.17940618 +0000 UTC m=+43.754942163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.728024 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2b4hg" podStartSLOduration=20.728002288 podStartE2EDuration="20.728002288s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.690936959 +0000 UTC m=+43.266472942" watchObservedRunningTime="2026-02-18 13:59:40.728002288 +0000 UTC m=+43.303538271" Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.780694 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.781284 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.281267916 +0000 UTC m=+43.856803899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.805998 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.814751 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 13:59:40 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Feb 18 13:59:40 crc kubenswrapper[4817]: [+]process-running ok Feb 18 13:59:40 crc kubenswrapper[4817]: healthz check failed Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.814820 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.837408 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ds5z6" podStartSLOduration=20.837387708 podStartE2EDuration="20.837387708s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.807008322 +0000 UTC 
m=+43.382544305" watchObservedRunningTime="2026-02-18 13:59:40.837387708 +0000 UTC m=+43.412923691" Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.883355 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.883812 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.383794029 +0000 UTC m=+43.959330032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 13:59:40.900462 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-l9ndv" podStartSLOduration=20.900438969 podStartE2EDuration="20.900438969s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:40.837381368 +0000 UTC m=+43.412917351" watchObservedRunningTime="2026-02-18 13:59:40.900438969 +0000 UTC m=+43.475974942" Feb 18 13:59:40 crc kubenswrapper[4817]: I0218 
13:59:40.994306 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:40 crc kubenswrapper[4817]: E0218 13:59:40.995237 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.495193241 +0000 UTC m=+44.070729224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.019927 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-rpmhp" podStartSLOduration=22.01990588 podStartE2EDuration="22.01990588s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.018477153 +0000 UTC m=+43.594013146" watchObservedRunningTime="2026-02-18 13:59:41.01990588 +0000 UTC m=+43.595441863" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.100753 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-twxwz" 
podStartSLOduration=22.100730562 podStartE2EDuration="22.100730562s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.053441558 +0000 UTC m=+43.628977541" watchObservedRunningTime="2026-02-18 13:59:41.100730562 +0000 UTC m=+43.676266545" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.110709 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:41 crc kubenswrapper[4817]: E0218 13:59:41.111134 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.61112112 +0000 UTC m=+44.186657103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.127665 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-v2snv" podStartSLOduration=22.127638878 podStartE2EDuration="22.127638878s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.125611985 +0000 UTC m=+43.701147968" watchObservedRunningTime="2026-02-18 13:59:41.127638878 +0000 UTC m=+43.703174861" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.214258 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:41 crc kubenswrapper[4817]: E0218 13:59:41.215121 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.715100041 +0000 UTC m=+44.290636014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.313030 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gj9xr" podStartSLOduration=22.313009314 podStartE2EDuration="22.313009314s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.270933125 +0000 UTC m=+43.846469118" watchObservedRunningTime="2026-02-18 13:59:41.313009314 +0000 UTC m=+43.888545297" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.317359 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:41 crc kubenswrapper[4817]: E0218 13:59:41.317911 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.81789217 +0000 UTC m=+44.393428153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.349663 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx" podStartSLOduration=21.349638202 podStartE2EDuration="21.349638202s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.349259942 +0000 UTC m=+43.924795925" watchObservedRunningTime="2026-02-18 13:59:41.349638202 +0000 UTC m=+43.925174185" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.349876 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7wn4g" podStartSLOduration=21.349870818 podStartE2EDuration="21.349870818s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.312387598 +0000 UTC m=+43.887923581" watchObservedRunningTime="2026-02-18 13:59:41.349870818 +0000 UTC m=+43.925406801" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.375149 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pqtc6" event={"ID":"e3015431-de34-487b-b9d9-473c6da0578b","Type":"ContainerStarted","Data":"8864956dbc95284d64d05137c5739eaf067f6905ebf2b8f512e1101798868fdd"} Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 
13:59:41.376148 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" event={"ID":"43f553ef-0150-4383-8c39-5db2cbcab63d","Type":"ContainerStarted","Data":"72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e"} Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.376171 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" event={"ID":"43f553ef-0150-4383-8c39-5db2cbcab63d","Type":"ContainerStarted","Data":"4e3851f97c3471d3814ea376705e8761238b95fdfd86b1dcc595c576a6eea6f2"} Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.377315 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.385244 4817 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-l7vpp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.385305 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" podUID="43f553ef-0150-4383-8c39-5db2cbcab63d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.417096 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc74r" podStartSLOduration=21.417076636 podStartE2EDuration="21.417076636s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.415786643 +0000 UTC m=+43.991322626" watchObservedRunningTime="2026-02-18 13:59:41.417076636 +0000 UTC m=+43.992612619" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.418785 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:41 crc kubenswrapper[4817]: E0218 13:59:41.419532 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:41.919480879 +0000 UTC m=+44.495016862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.426362 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" event={"ID":"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982","Type":"ContainerStarted","Data":"1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc"} Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.427174 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.482707 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb" event={"ID":"9f128b3f-8527-4b4b-86d5-b456fe89c804","Type":"ContainerStarted","Data":"0d6937cc3638438e9623d1edcd2cce5f75f873faf959108feb974817b1b9c75e"} Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.483398 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.491183 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" event={"ID":"5b9edf7b-9549-4eda-a45e-27b94c137b4a","Type":"ContainerStarted","Data":"160e27066ca9992948e36c0db76389dd7e73c5ac21cec59b50c979afe0a2bffc"} Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.491984 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.522997 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.532082 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 13:59:41 crc kubenswrapper[4817]: E0218 13:59:41.536297 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.0362724 +0000 UTC m=+44.611808383 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.549707 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rs5bx" event={"ID":"be1e2228-c1df-48a2-83a3-ef74747c69c9","Type":"ContainerStarted","Data":"67ac63d671ed22531ba6f4176e76e0360dbd1f1dcb495a23129b87a9140ce2a6"} Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.561603 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" podStartSLOduration=21.561582004999998 podStartE2EDuration="21.561582005s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.516440097 +0000 UTC m=+44.091976080" watchObservedRunningTime="2026-02-18 13:59:41.561582005 +0000 UTC m=+44.137117988" Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.578514 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb" podStartSLOduration=22.578488362999998 podStartE2EDuration="22.578488363s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.560869297 +0000 UTC m=+44.136405280" watchObservedRunningTime="2026-02-18 13:59:41.578488363 +0000 UTC 
m=+44.154024336"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.580711 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" event={"ID":"e983df69-9be7-498d-a703-90a309bd98e8","Type":"ContainerStarted","Data":"f5948edc362c9eadd4f1d2672eaa96915524f0253551bfd3a749edc62bcc907f"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.580800 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.580814 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" event={"ID":"e983df69-9be7-498d-a703-90a309bd98e8","Type":"ContainerStarted","Data":"39747294a579c83aad7d1a67bc8c4476d57ece7e6d35ff30b0f1ebf076c34f25"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.580945 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.587322 4817 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-phf44 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body=
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.587378 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" podUID="e983df69-9be7-498d-a703-90a309bd98e8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.598071 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" podStartSLOduration=7.598055269 podStartE2EDuration="7.598055269s" podCreationTimestamp="2026-02-18 13:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.595669837 +0000 UTC m=+44.171205820" watchObservedRunningTime="2026-02-18 13:59:41.598055269 +0000 UTC m=+44.173591252"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.626901 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mj6tz" event={"ID":"9f66536d-c481-41b3-b5e5-8259651a95d9","Type":"ContainerStarted","Data":"803d9e37846a4ce2f284791aefe457229383b30f7007ff4d64aef0c74f13cb56"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.627855 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:41 crc kubenswrapper[4817]: E0218 13:59:41.629104 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.129083332 +0000 UTC m=+44.704619315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.682783 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-87jjf" event={"ID":"ffb4e423-81f2-4b61-9b22-c4f6d8504861","Type":"ContainerStarted","Data":"0057d8b145fb7c75397c6501367ae0f3d77db5b5e2712b741d9484ca06e274f3"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.682848 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-87jjf" event={"ID":"ffb4e423-81f2-4b61-9b22-c4f6d8504861","Type":"ContainerStarted","Data":"211478254a83c58811c3fa9a298ba62e96067e9539b72ec39027303361b266f2"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.711347 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9mrg6" podStartSLOduration=21.711316049 podStartE2EDuration="21.711316049s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.626330931 +0000 UTC m=+44.201866914" watchObservedRunningTime="2026-02-18 13:59:41.711316049 +0000 UTC m=+44.286852032"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.734184 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" event={"ID":"614109c1-b0a9-4ff3-b3d4-04325985c7df","Type":"ContainerStarted","Data":"386ca2eb23f05350984581bfb9ad41b9d4ee60c0405140abe3b04f3d8e6ed42a"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.735452 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:41 crc kubenswrapper[4817]: E0218 13:59:41.751664 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.251586281 +0000 UTC m=+44.827122264 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.747592 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m47g9" event={"ID":"09b37c68-ecb2-4b3c-8cac-4d457be545ec","Type":"ContainerStarted","Data":"35c9fb2ec9810e1e2a521685a1ac54e89e78539aff465e12803b90113de5bab9"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.761839 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m47g9" event={"ID":"09b37c68-ecb2-4b3c-8cac-4d457be545ec","Type":"ContainerStarted","Data":"2b2e019e932cb307a70d7396ce2458771703f0090535c0581ef6bdc5ab568199"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.766396 4817 generic.go:334] "Generic (PLEG): container finished" podID="ca679b0d-4e7e-4526-af6e-e3b0cb400fd0" containerID="e6ae443919d37adb7690d56dd09ab3bf20aa47d914ad9116f6424cf134784a55" exitCode=0
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.767382 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" event={"ID":"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0","Type":"ContainerDied","Data":"e6ae443919d37adb7690d56dd09ab3bf20aa47d914ad9116f6424cf134784a55"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.767412 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" event={"ID":"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0","Type":"ContainerStarted","Data":"972a709fe11323ea90e19a765dd12c3fa55daa1f6eff08655699a4f36ee26fa8"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.769481 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" podStartSLOduration=21.769469964 podStartE2EDuration="21.769469964s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.759754213 +0000 UTC m=+44.335290196" watchObservedRunningTime="2026-02-18 13:59:41.769469964 +0000 UTC m=+44.345005947"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.780567 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" event={"ID":"39a56faf-6fea-45d0-9531-fb86f571fd8b","Type":"ContainerStarted","Data":"91fb855daa3b08d69c92e04c18b40eaae7d8f9a536834a012f82f23b235f7ec7"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.780624 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" event={"ID":"39a56faf-6fea-45d0-9531-fb86f571fd8b","Type":"ContainerStarted","Data":"617ee40a821789b35f46fd4fda50ddf8b86146ed0ba0705cbe8b35afdf0f6ccd"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.782881 4817 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-bclz6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body=
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.782928 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" podUID="39a56faf-6fea-45d0-9531-fb86f571fd8b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.783436 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.785255 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" event={"ID":"3707018b-031a-4902-8e5c-ba5bc46cc4c4","Type":"ContainerStarted","Data":"3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.785286 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" event={"ID":"3707018b-031a-4902-8e5c-ba5bc46cc4c4","Type":"ContainerStarted","Data":"9fd80a5bb0c0fbd10d8b687a39ea6f110c7385b2921dfc302b535b082dcc4fa3"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.787998 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.792992 4817 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-hrdpr container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.793052 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" podUID="3707018b-031a-4902-8e5c-ba5bc46cc4c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.814704 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79" event={"ID":"0f3e1cca-80c4-4d17-bb9f-9556fed78aac","Type":"ContainerStarted","Data":"384a50fb2201517570ae6ed59a4a7755f64327aa00a4fa9d8f9aa6cbb11544c9"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.815808 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.824913 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:41 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:41 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:41 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.824997 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.838083 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" event={"ID":"e61f548d-eef7-4f2a-9854-c5bdb6b2b815","Type":"ContainerStarted","Data":"09d1dc42e8c916b6a049f6c15d5df6ab75ef25a62b36a5ea588dade0506320fc"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.839363 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.839696 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:41 crc kubenswrapper[4817]: E0218 13:59:41.841441 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.341415195 +0000 UTC m=+44.916951168 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.844411 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" podStartSLOduration=21.844392723 podStartE2EDuration="21.844392723s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.843283324 +0000 UTC m=+44.418819307" watchObservedRunningTime="2026-02-18 13:59:41.844392723 +0000 UTC m=+44.419928706"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.844616 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-87jjf" podStartSLOduration=8.844612588 podStartE2EDuration="8.844612588s" podCreationTimestamp="2026-02-18 13:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.81568507 +0000 UTC m=+44.391221053" watchObservedRunningTime="2026-02-18 13:59:41.844612588 +0000 UTC m=+44.420148571"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.860473 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" event={"ID":"1ae1a25b-712e-42ec-9b7b-4925ba7f97af","Type":"ContainerStarted","Data":"07173f3951dd1780bd87d6ece2710bc34e849d276b8e0d0673b5b1e5ed50ad13"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.860526 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" event={"ID":"1ae1a25b-712e-42ec-9b7b-4925ba7f97af","Type":"ContainerStarted","Data":"565cbc74ef995b6d36f65e8fcfd3c83f00c26dd719ad6036c9ef3e7ca13cbe62"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.902060 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r" event={"ID":"3bd99710-b175-4115-8944-1fac544145c5","Type":"ContainerStarted","Data":"8d3ec240036fcc06c3ad690cccb6046ec21fdc5d3168e49a6f16d44ff5351141"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.902136 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r" event={"ID":"3bd99710-b175-4115-8944-1fac544145c5","Type":"ContainerStarted","Data":"bd523405a1cfc15bcff6a32fa6b24d078d01dea403f4c92a0cbdbeadc68b8241"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.902156 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r" event={"ID":"3bd99710-b175-4115-8944-1fac544145c5","Type":"ContainerStarted","Data":"caf9ab652d8ec48a41f8d1efd94bae9d7b61cf141217af25f11aa318076e41b4"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.912397 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" podStartSLOduration=22.912375842 podStartE2EDuration="22.912375842s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.881852292 +0000 UTC m=+44.457388275" watchObservedRunningTime="2026-02-18 13:59:41.912375842 +0000 UTC m=+44.487911825"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.917249 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m47g9" podStartSLOduration=7.917228767 podStartE2EDuration="7.917228767s" podCreationTimestamp="2026-02-18 13:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.910701558 +0000 UTC m=+44.486237531" watchObservedRunningTime="2026-02-18 13:59:41.917228767 +0000 UTC m=+44.492764750"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.939710 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv" event={"ID":"acfbc5ef-21ea-41bb-ba10-c5b5e79dd593","Type":"ContainerStarted","Data":"63b1e4bf1f632ac1824f558e7cce779342e58b0306eb0fbe0858ae8410b3d8c9"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.942450 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:41 crc kubenswrapper[4817]: E0218 13:59:41.942945 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.442927822 +0000 UTC m=+45.018463805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.957891 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" event={"ID":"6df56857-33ce-442e-a600-188003fc196e","Type":"ContainerStarted","Data":"631da3ba6e04a46b2b82cbcbce44d535a03a051c68d8f96782d02e0b479dc46b"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.960332 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" event={"ID":"3f85543c-dccf-4a3e-be40-7305a2e49d1d","Type":"ContainerStarted","Data":"0d0b900a8db1d295b441b5270730b75e4e473ab00702ff85d8156305dc718499"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.962769 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pj24h" event={"ID":"29f1a30b-47d4-452e-9017-dcc9cf78795f","Type":"ContainerStarted","Data":"2001c372515ac1ad657dd2ba24014d750ae586efc98f3e18a2a623872cc0b0ea"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.962794 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pj24h" event={"ID":"29f1a30b-47d4-452e-9017-dcc9cf78795f","Type":"ContainerStarted","Data":"987190a46aa98befbc046b1638f985c3a2489a3dbaa71b1855a17a10b7048430"}
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.967363 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-wwvbf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.967428 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wwvbf" podUID="5345e1d1-a74f-4d8f-8b86-2bb389a525a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Feb 18 13:59:41 crc kubenswrapper[4817]: I0218 13:59:41.980562 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-975fj" podStartSLOduration=22.980538125 podStartE2EDuration="22.980538125s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:41.979724794 +0000 UTC m=+44.555260777" watchObservedRunningTime="2026-02-18 13:59:41.980538125 +0000 UTC m=+44.556074108"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.038311 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mx7fv" podStartSLOduration=22.038292779 podStartE2EDuration="22.038292779s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:42.036435931 +0000 UTC m=+44.611971924" watchObservedRunningTime="2026-02-18 13:59:42.038292779 +0000 UTC m=+44.613828762"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.043938 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.045590 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.545567448 +0000 UTC m=+45.121103421 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.086048 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-pb2jx"]
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.086907 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-x8x2r" podStartSLOduration=22.086888386 podStartE2EDuration="22.086888386s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:42.081114076 +0000 UTC m=+44.656650059" watchObservedRunningTime="2026-02-18 13:59:42.086888386 +0000 UTC m=+44.662424379"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.125263 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vqv79" podStartSLOduration=22.125242618 podStartE2EDuration="22.125242618s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:42.121590594 +0000 UTC m=+44.697126577" watchObservedRunningTime="2026-02-18 13:59:42.125242618 +0000 UTC m=+44.700778601"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.187399 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.195404 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vs5x4" podStartSLOduration=22.195380943 podStartE2EDuration="22.195380943s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:42.194645574 +0000 UTC m=+44.770181567" watchObservedRunningTime="2026-02-18 13:59:42.195380943 +0000 UTC m=+44.770916926"
Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.197883 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.697858897 +0000 UTC m=+45.273394880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.234549 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q6hcz" podStartSLOduration=22.234395182 podStartE2EDuration="22.234395182s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:42.229445804 +0000 UTC m=+44.804981787" watchObservedRunningTime="2026-02-18 13:59:42.234395182 +0000 UTC m=+44.809931165"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.274251 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pj24h" podStartSLOduration=22.274226553 podStartE2EDuration="22.274226553s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:42.272684333 +0000 UTC m=+44.848220316" watchObservedRunningTime="2026-02-18 13:59:42.274226553 +0000 UTC m=+44.849762536"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.289489 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.290360 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.79033578 +0000 UTC m=+45.365871763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.333158 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-54ftv"]
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.334272 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54ftv"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.342166 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.345737 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54ftv"]
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.391458 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.391777 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.891764554 +0000 UTC m=+45.467300527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.493305 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.493596 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.993552557 +0000 UTC m=+45.569088530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.493797 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-utilities\") pod \"community-operators-54ftv\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " pod="openshift-marketplace/community-operators-54ftv"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.494049 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc7nx\" (UniqueName: \"kubernetes.io/projected/307f9900-9137-46bb-9b32-254ae14c8c17-kube-api-access-cc7nx\") pod \"community-operators-54ftv\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " pod="openshift-marketplace/community-operators-54ftv"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.494130 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-catalog-content\") pod \"community-operators-54ftv\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " pod="openshift-marketplace/community-operators-54ftv"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.494215 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.494662 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:42.994652136 +0000 UTC m=+45.570188379 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.538478 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hxln5"]
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.539536 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hxln5"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.544649 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.559280 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hxln5"]
Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.595778 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.595951 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.095915356 +0000 UTC m=+45.671451339 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.596031 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-utilities\") pod \"community-operators-54ftv\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " pod="openshift-marketplace/community-operators-54ftv" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.596110 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc7nx\" (UniqueName: \"kubernetes.io/projected/307f9900-9137-46bb-9b32-254ae14c8c17-kube-api-access-cc7nx\") pod \"community-operators-54ftv\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " pod="openshift-marketplace/community-operators-54ftv" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.596136 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-catalog-content\") pod \"community-operators-54ftv\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " pod="openshift-marketplace/community-operators-54ftv" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.596176 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: 
\"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.596560 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-utilities\") pod \"community-operators-54ftv\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " pod="openshift-marketplace/community-operators-54ftv" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.596623 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-catalog-content\") pod \"community-operators-54ftv\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " pod="openshift-marketplace/community-operators-54ftv" Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.596920 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.096906412 +0000 UTC m=+45.672442395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.624131 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc7nx\" (UniqueName: \"kubernetes.io/projected/307f9900-9137-46bb-9b32-254ae14c8c17-kube-api-access-cc7nx\") pod \"community-operators-54ftv\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " pod="openshift-marketplace/community-operators-54ftv" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.693163 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54ftv" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.696811 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.697026 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.196998601 +0000 UTC m=+45.772534584 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.697090 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr8hq\" (UniqueName: \"kubernetes.io/projected/8162b014-86d1-482a-8c7c-eba34fed3f62-kube-api-access-wr8hq\") pod \"certified-operators-hxln5\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " pod="openshift-marketplace/certified-operators-hxln5" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.697394 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-utilities\") pod \"certified-operators-hxln5\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " pod="openshift-marketplace/certified-operators-hxln5" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.697459 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.697497 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-catalog-content\") pod \"certified-operators-hxln5\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " pod="openshift-marketplace/certified-operators-hxln5" Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.697822 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.197802902 +0000 UTC m=+45.773338875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.737451 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2z68r"] Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.739018 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.798466 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.798702 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.298670112 +0000 UTC m=+45.874206095 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.798896 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-utilities\") pod \"certified-operators-hxln5\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " pod="openshift-marketplace/certified-operators-hxln5" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.798941 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.798965 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-catalog-content\") pod \"certified-operators-hxln5\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " pod="openshift-marketplace/certified-operators-hxln5" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.799011 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr8hq\" (UniqueName: \"kubernetes.io/projected/8162b014-86d1-482a-8c7c-eba34fed3f62-kube-api-access-wr8hq\") pod \"certified-operators-hxln5\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " pod="openshift-marketplace/certified-operators-hxln5" Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.799448 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.299425112 +0000 UTC m=+45.874961095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.799470 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-utilities\") pod \"certified-operators-hxln5\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " pod="openshift-marketplace/certified-operators-hxln5" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.799600 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-catalog-content\") pod \"certified-operators-hxln5\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " pod="openshift-marketplace/certified-operators-hxln5" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.806161 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 13:59:42 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Feb 18 13:59:42 crc kubenswrapper[4817]: [+]process-running ok Feb 18 13:59:42 crc kubenswrapper[4817]: healthz check failed Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.806232 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.807180 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2z68r"] Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.824652 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr8hq\" (UniqueName: \"kubernetes.io/projected/8162b014-86d1-482a-8c7c-eba34fed3f62-kube-api-access-wr8hq\") pod \"certified-operators-hxln5\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " pod="openshift-marketplace/certified-operators-hxln5" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.862788 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hxln5" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.900597 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.900962 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-catalog-content\") pod \"community-operators-2z68r\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.901022 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcv9\" (UniqueName: \"kubernetes.io/projected/c8e393a5-61e4-4e91-bec4-770687b8d01b-kube-api-access-rxcv9\") pod \"community-operators-2z68r\" (UID: 
\"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.901046 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-utilities\") pod \"community-operators-2z68r\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:42 crc kubenswrapper[4817]: E0218 13:59:42.901208 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.401187524 +0000 UTC m=+45.976723507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.945744 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-86tmt"] Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.946845 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:42 crc kubenswrapper[4817]: I0218 13:59:42.972659 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-86tmt"] Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.002279 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-catalog-content\") pod \"community-operators-2z68r\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.002343 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxcv9\" (UniqueName: \"kubernetes.io/projected/c8e393a5-61e4-4e91-bec4-770687b8d01b-kube-api-access-rxcv9\") pod \"community-operators-2z68r\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.002370 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-utilities\") pod \"community-operators-2z68r\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.002419 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.002736 4817 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.502723391 +0000 UTC m=+46.078259374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.006894 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-utilities\") pod \"community-operators-2z68r\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.013086 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-catalog-content\") pod \"community-operators-2z68r\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.046380 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxcv9\" (UniqueName: \"kubernetes.io/projected/c8e393a5-61e4-4e91-bec4-770687b8d01b-kube-api-access-rxcv9\") pod \"community-operators-2z68r\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.053050 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2z68r" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.074791 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" event={"ID":"ca679b0d-4e7e-4526-af6e-e3b0cb400fd0","Type":"ContainerStarted","Data":"b04814cd050293ac730fcbc3568ab2923f534dc9ab39dc3ffcaacb2916f4191e"} Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.088577 4817 generic.go:334] "Generic (PLEG): container finished" podID="3f85543c-dccf-4a3e-be40-7305a2e49d1d" containerID="0d0b900a8db1d295b441b5270730b75e4e473ab00702ff85d8156305dc718499" exitCode=0 Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.088687 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" event={"ID":"3f85543c-dccf-4a3e-be40-7305a2e49d1d","Type":"ContainerDied","Data":"0d0b900a8db1d295b441b5270730b75e4e473ab00702ff85d8156305dc718499"} Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.103691 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.104096 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-catalog-content\") pod \"certified-operators-86tmt\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.104144 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-utilities\") pod \"certified-operators-86tmt\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.104189 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvd9n\" (UniqueName: \"kubernetes.io/projected/99fce15c-b13c-4341-b1a0-494d5bd3f76a-kube-api-access-wvd9n\") pod \"certified-operators-86tmt\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.104299 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.604242988 +0000 UTC m=+46.179778971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.117421 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" event={"ID":"614109c1-b0a9-4ff3-b3d4-04325985c7df","Type":"ContainerStarted","Data":"418bd9d81e9d6ceab6a51a3efd5e8b36045e37105e4df91c61955a193fef3802"} Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.125041 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54ftv"] Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.145685 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pqtc6" event={"ID":"e3015431-de34-487b-b9d9-473c6da0578b","Type":"ContainerStarted","Data":"4368f6d9a31e03cabee87ee5ec1539c5f185d3aec7dea4dc908c867ba089b391"} Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.145763 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pqtc6" event={"ID":"e3015431-de34-487b-b9d9-473c6da0578b","Type":"ContainerStarted","Data":"d80a665706ce6fcfd069a464dfaa3e4c554464ed5501c8e22fe8273eb4b1abf1"} Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.146744 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pqtc6" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.170905 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" 
event={"ID":"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c","Type":"ContainerStarted","Data":"7a3c086f60392a6dd495eecad0e90307dba13d18d4ba1ee80025219531fec360"} Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.170960 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" event={"ID":"67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c","Type":"ContainerStarted","Data":"418cfa423e9e98526eb3b48a8a564bda17bdaaf4c2bd9c936c4b4ccca6038a9b"} Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.176230 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-wwvbf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.176304 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wwvbf" podUID="5345e1d1-a74f-4d8f-8b86-2bb389a525a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.189479 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.190649 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.204198 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-phf44" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.219738 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-utilities\") pod \"certified-operators-86tmt\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.221132 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvd9n\" (UniqueName: \"kubernetes.io/projected/99fce15c-b13c-4341-b1a0-494d5bd3f76a-kube-api-access-wvd9n\") pod \"certified-operators-86tmt\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.221972 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.222012 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-catalog-content\") pod \"certified-operators-86tmt\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.222534 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-catalog-content\") pod \"certified-operators-86tmt\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.223835 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-utilities\") pod \"certified-operators-86tmt\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.225614 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.725584658 +0000 UTC m=+46.301120641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.254354 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.304452 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvd9n\" (UniqueName: \"kubernetes.io/projected/99fce15c-b13c-4341-b1a0-494d5bd3f76a-kube-api-access-wvd9n\") pod \"certified-operators-86tmt\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.305420 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pqtc6" podStartSLOduration=10.305392652 podStartE2EDuration="10.305392652s" podCreationTimestamp="2026-02-18 13:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:43.286841073 +0000 UTC m=+45.862377056" watchObservedRunningTime="2026-02-18 13:59:43.305392652 +0000 UTC m=+45.880928645" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.307202 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" podStartSLOduration=23.307191589 podStartE2EDuration="23.307191589s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:43.232103526 +0000 UTC m=+45.807639529" watchObservedRunningTime="2026-02-18 13:59:43.307191589 +0000 UTC m=+45.882727582" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.329372 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.330765 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.830721908 +0000 UTC m=+46.406257941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.353825 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.355606 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.855584021 +0000 UTC m=+46.431120004 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.455178 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.455713 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:43.955677591 +0000 UTC m=+46.531213564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.510781 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hxln5"] Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.549024 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" podStartSLOduration=24.549003406 podStartE2EDuration="24.549003406s" podCreationTimestamp="2026-02-18 13:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:43.548426921 +0000 UTC m=+46.123962904" watchObservedRunningTime="2026-02-18 13:59:43.549003406 +0000 UTC m=+46.124539389" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.558054 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.558658 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-18 13:59:44.058640175 +0000 UTC m=+46.634176158 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.601637 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86tmt" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.659068 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.659596 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:44.159577856 +0000 UTC m=+46.735113839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.764203 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.764636 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:44.264622514 +0000 UTC m=+46.840158497 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.807802 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2z68r"] Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.812331 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 13:59:43 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Feb 18 13:59:43 crc kubenswrapper[4817]: [+]process-running ok Feb 18 13:59:43 crc kubenswrapper[4817]: healthz check failed Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.812415 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.866221 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.867028 4817 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:44.367010403 +0000 UTC m=+46.942546386 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:43 crc kubenswrapper[4817]: I0218 13:59:43.970401 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:43 crc kubenswrapper[4817]: E0218 13:59:43.971059 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:44.471034015 +0000 UTC m=+47.046569998 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.073568 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:44 crc kubenswrapper[4817]: E0218 13:59:44.074137 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:44.574119482 +0000 UTC m=+47.149655455 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.178361 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:44 crc kubenswrapper[4817]: E0218 13:59:44.178953 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:44.678937474 +0000 UTC m=+47.254473457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.208817 4817 generic.go:334] "Generic (PLEG): container finished" podID="307f9900-9137-46bb-9b32-254ae14c8c17" containerID="22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1" exitCode=0 Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.213660 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.214785 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54ftv" event={"ID":"307f9900-9137-46bb-9b32-254ae14c8c17","Type":"ContainerDied","Data":"22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1"} Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.214839 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54ftv" event={"ID":"307f9900-9137-46bb-9b32-254ae14c8c17","Type":"ContainerStarted","Data":"c866af86003101a5b03304faf9f9c8eba07e17488bbf0d2e6a9406274bbcad67"} Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.233842 4817 generic.go:334] "Generic (PLEG): container finished" podID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerID="aef25b3d5c1cf6030c480735a540395a770eecf1c36854fce256f043901d64ed" exitCode=0 Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.234572 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z68r" 
event={"ID":"c8e393a5-61e4-4e91-bec4-770687b8d01b","Type":"ContainerDied","Data":"aef25b3d5c1cf6030c480735a540395a770eecf1c36854fce256f043901d64ed"} Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.234626 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z68r" event={"ID":"c8e393a5-61e4-4e91-bec4-770687b8d01b","Type":"ContainerStarted","Data":"f812b0571f00f76f0c5dd6615443393efddcb51f670924edf17c87aeb6413b65"} Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.249155 4817 generic.go:334] "Generic (PLEG): container finished" podID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerID="94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385" exitCode=0 Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.249244 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxln5" event={"ID":"8162b014-86d1-482a-8c7c-eba34fed3f62","Type":"ContainerDied","Data":"94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385"} Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.249278 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxln5" event={"ID":"8162b014-86d1-482a-8c7c-eba34fed3f62","Type":"ContainerStarted","Data":"6648c733b10888135df31bb74dcee822a035d8432afeefa97dd64240c9bdd812"} Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.260876 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" podUID="3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" gracePeriod=30 Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.261265 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" 
event={"ID":"614109c1-b0a9-4ff3-b3d4-04325985c7df","Type":"ContainerStarted","Data":"430d4e58dd506b175882a9ac4ff5708207a17f5ae9aff525b0051b570f59d872"} Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.284332 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dqctb" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.284611 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:44 crc kubenswrapper[4817]: E0218 13:59:44.285408 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:44.785375588 +0000 UTC m=+47.360911561 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.305429 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-86tmt"] Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.329072 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.333005 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6w5"] Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.335730 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.338533 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.342115 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.342426 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.342560 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.350061 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6w5"] Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.363780 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.385920 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:44 crc kubenswrapper[4817]: E0218 13:59:44.388263 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:44.88824816 +0000 UTC m=+47.463784143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.494725 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.495070 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"557bca4a-3b11-4613-bc75-3bdfd3ddbd25\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.495094 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-catalog-content\") pod \"redhat-marketplace-jp6w5\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.495117 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m47tw\" (UniqueName: \"kubernetes.io/projected/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-kube-api-access-m47tw\") pod 
\"redhat-marketplace-jp6w5\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.495144 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"557bca4a-3b11-4613-bc75-3bdfd3ddbd25\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.495170 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-utilities\") pod \"redhat-marketplace-jp6w5\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: E0218 13:59:44.495303 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:44.995283369 +0000 UTC m=+47.570819342 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.598542 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-catalog-content\") pod \"redhat-marketplace-jp6w5\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.598579 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"557bca4a-3b11-4613-bc75-3bdfd3ddbd25\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.598600 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m47tw\" (UniqueName: \"kubernetes.io/projected/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-kube-api-access-m47tw\") pod \"redhat-marketplace-jp6w5\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.598622 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: 
\"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.598643 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"557bca4a-3b11-4613-bc75-3bdfd3ddbd25\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.598665 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-utilities\") pod \"redhat-marketplace-jp6w5\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.599240 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-utilities\") pod \"redhat-marketplace-jp6w5\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: E0218 13:59:44.599619 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:45.099605808 +0000 UTC m=+47.675141791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.599802 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"557bca4a-3b11-4613-bc75-3bdfd3ddbd25\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.599952 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-catalog-content\") pod \"redhat-marketplace-jp6w5\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.629843 4817 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.629879 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"557bca4a-3b11-4613-bc75-3bdfd3ddbd25\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.637018 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m47tw\" (UniqueName: \"kubernetes.io/projected/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-kube-api-access-m47tw\") pod \"redhat-marketplace-jp6w5\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.671918 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.688080 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.700280 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f85543c-dccf-4a3e-be40-7305a2e49d1d-secret-volume\") pod \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.700407 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx565\" (UniqueName: \"kubernetes.io/projected/3f85543c-dccf-4a3e-be40-7305a2e49d1d-kube-api-access-xx565\") pod \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.700531 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.700591 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3f85543c-dccf-4a3e-be40-7305a2e49d1d-config-volume\") pod \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\" (UID: \"3f85543c-dccf-4a3e-be40-7305a2e49d1d\") " Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.704546 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f85543c-dccf-4a3e-be40-7305a2e49d1d-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f85543c-dccf-4a3e-be40-7305a2e49d1d" (UID: "3f85543c-dccf-4a3e-be40-7305a2e49d1d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 13:59:44 crc kubenswrapper[4817]: E0218 13:59:44.704686 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:45.204665916 +0000 UTC m=+47.780201899 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.713473 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f85543c-dccf-4a3e-be40-7305a2e49d1d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f85543c-dccf-4a3e-be40-7305a2e49d1d" (UID: "3f85543c-dccf-4a3e-be40-7305a2e49d1d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.716322 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f85543c-dccf-4a3e-be40-7305a2e49d1d-kube-api-access-xx565" (OuterVolumeSpecName: "kube-api-access-xx565") pod "3f85543c-dccf-4a3e-be40-7305a2e49d1d" (UID: "3f85543c-dccf-4a3e-be40-7305a2e49d1d"). InnerVolumeSpecName "kube-api-access-xx565". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.740373 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.772342 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7rx2s"] Feb 18 13:59:44 crc kubenswrapper[4817]: E0218 13:59:44.773025 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f85543c-dccf-4a3e-be40-7305a2e49d1d" containerName="collect-profiles" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.773046 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f85543c-dccf-4a3e-be40-7305a2e49d1d" containerName="collect-profiles" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.773248 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f85543c-dccf-4a3e-be40-7305a2e49d1d" containerName="collect-profiles" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.775231 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.791487 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rx2s"] Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.802821 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-utilities\") pod \"redhat-marketplace-7rx2s\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") " pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.802908 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-catalog-content\") pod \"redhat-marketplace-7rx2s\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") " pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.802953 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkmk4\" (UniqueName: \"kubernetes.io/projected/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-kube-api-access-dkmk4\") pod \"redhat-marketplace-7rx2s\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") " pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.803016 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 
13:59:44.803062 4817 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f85543c-dccf-4a3e-be40-7305a2e49d1d-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.803076 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx565\" (UniqueName: \"kubernetes.io/projected/3f85543c-dccf-4a3e-be40-7305a2e49d1d-kube-api-access-xx565\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.803088 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f85543c-dccf-4a3e-be40-7305a2e49d1d-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 13:59:44 crc kubenswrapper[4817]: E0218 13:59:44.803389 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:45.30337418 +0000 UTC m=+47.878910163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.819990 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 13:59:44 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Feb 18 13:59:44 crc kubenswrapper[4817]: [+]process-running ok Feb 18 13:59:44 crc kubenswrapper[4817]: healthz check failed Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.820520 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.905497 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.905813 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-utilities\") pod \"redhat-marketplace-7rx2s\" (UID: 
\"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") " pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.905879 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-catalog-content\") pod \"redhat-marketplace-7rx2s\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") " pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.905921 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkmk4\" (UniqueName: \"kubernetes.io/projected/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-kube-api-access-dkmk4\") pod \"redhat-marketplace-7rx2s\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") " pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:44 crc kubenswrapper[4817]: E0218 13:59:44.907027 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:45.406747235 +0000 UTC m=+47.982283218 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.908568 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-catalog-content\") pod \"redhat-marketplace-7rx2s\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") " pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.914400 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-utilities\") pod \"redhat-marketplace-7rx2s\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") " pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:44 crc kubenswrapper[4817]: I0218 13:59:44.948666 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkmk4\" (UniqueName: \"kubernetes.io/projected/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-kube-api-access-dkmk4\") pod \"redhat-marketplace-7rx2s\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") " pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.007066 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:45 crc kubenswrapper[4817]: E0218 13:59:45.007471 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:45.507456361 +0000 UTC m=+48.082992334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.111002 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:45 crc kubenswrapper[4817]: E0218 13:59:45.111256 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:45.611214575 +0000 UTC m=+48.186750548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.111801 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:45 crc kubenswrapper[4817]: E0218 13:59:45.112273 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:45.612257152 +0000 UTC m=+48.187793135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.113516 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6w5"] Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.138057 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rx2s" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.208433 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.212259 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:45 crc kubenswrapper[4817]: E0218 13:59:45.212540 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-18 13:59:45.712520606 +0000 UTC m=+48.288056589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.275728 4817 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-18T13:59:44.630170729Z","Handler":null,"Name":""} Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.305706 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" event={"ID":"3f85543c-dccf-4a3e-be40-7305a2e49d1d","Type":"ContainerDied","Data":"23b7bdfebe45c1cdd3b38d114864281e4380b7ed9d829a4ee15d265d3a4509f8"} Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.305755 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b7bdfebe45c1cdd3b38d114864281e4380b7ed9d829a4ee15d265d3a4509f8" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.305879 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.313722 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:45 crc kubenswrapper[4817]: E0218 13:59:45.314168 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-18 13:59:45.814153516 +0000 UTC m=+48.389689499 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mddfd" (UID: "4faf1743-e825-477d-b191-830513a39317") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.315171 4817 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.315245 4817 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.349818 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" event={"ID":"614109c1-b0a9-4ff3-b3d4-04325985c7df","Type":"ContainerStarted","Data":"325658ac515a6d3d6fc3c209f9182a7032fc1721a7d8abb2a30d16d7f4ee1ab8"} Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.349881 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" event={"ID":"614109c1-b0a9-4ff3-b3d4-04325985c7df","Type":"ContainerStarted","Data":"d66f15fabde84a9b9b7e9a05d6139c8e2c1d8875f50e71481ee21755a0c19b4d"} Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.366740 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6w5" event={"ID":"a860a4b3-b2dd-4d2f-8d2f-a959007a6197","Type":"ContainerStarted","Data":"62fb97cbfacc50e216527029bc24d68da25c026473c5ec48450bc55b0896a11e"} Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.410478 4817 generic.go:334] "Generic (PLEG): container finished" podID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerID="000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132" exitCode=0 Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.410905 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86tmt" event={"ID":"99fce15c-b13c-4341-b1a0-494d5bd3f76a","Type":"ContainerDied","Data":"000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132"} Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.410966 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86tmt" event={"ID":"99fce15c-b13c-4341-b1a0-494d5bd3f76a","Type":"ContainerStarted","Data":"8c99bfa6e9fdc892640d7765a99b85a4c0be2f0c50d238a35cc4f5c958f1db72"} Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.414675 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.434434 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.439869 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-x4qx6" podStartSLOduration=12.439829808 podStartE2EDuration="12.439829808s" podCreationTimestamp="2026-02-18 13:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:45.381797216 +0000 UTC m=+47.957333199" watchObservedRunningTime="2026-02-18 13:59:45.439829808 +0000 UTC m=+48.015365791" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.471228 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"557bca4a-3b11-4613-bc75-3bdfd3ddbd25","Type":"ContainerStarted","Data":"b05f406f5e49c4927e3fcce61324e1cbe387e08a8b73b2d35c62b2cdef42168d"} Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.523068 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.543284 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b7925"] Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.544435 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7925" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.559289 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.576199 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.576259 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.584878 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7925"] Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.626031 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-utilities\") pod \"redhat-operators-b7925\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " pod="openshift-marketplace/redhat-operators-b7925" Feb 
18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.626158 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tms49\" (UniqueName: \"kubernetes.io/projected/44a62058-ed9b-4364-97d0-09af2bb1c22d-kube-api-access-tms49\") pod \"redhat-operators-b7925\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " pod="openshift-marketplace/redhat-operators-b7925" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.626193 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-catalog-content\") pod \"redhat-operators-b7925\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " pod="openshift-marketplace/redhat-operators-b7925" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.729440 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-utilities\") pod \"redhat-operators-b7925\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " pod="openshift-marketplace/redhat-operators-b7925" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.729592 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tms49\" (UniqueName: \"kubernetes.io/projected/44a62058-ed9b-4364-97d0-09af2bb1c22d-kube-api-access-tms49\") pod \"redhat-operators-b7925\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " pod="openshift-marketplace/redhat-operators-b7925" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.729630 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-catalog-content\") pod \"redhat-operators-b7925\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " 
pod="openshift-marketplace/redhat-operators-b7925" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.730373 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-catalog-content\") pod \"redhat-operators-b7925\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " pod="openshift-marketplace/redhat-operators-b7925" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.730908 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-utilities\") pod \"redhat-operators-b7925\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " pod="openshift-marketplace/redhat-operators-b7925" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.744187 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mddfd\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.755492 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.794607 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rx2s"] Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.810603 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 13:59:45 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Feb 18 13:59:45 crc kubenswrapper[4817]: [+]process-running ok Feb 18 13:59:45 crc kubenswrapper[4817]: healthz check failed Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.810672 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.836426 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.838622 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.846035 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.850210 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.851485 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.851651 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tms49\" (UniqueName: \"kubernetes.io/projected/44a62058-ed9b-4364-97d0-09af2bb1c22d-kube-api-access-tms49\") pod \"redhat-operators-b7925\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " pod="openshift-marketplace/redhat-operators-b7925" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.915203 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7925" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.928604 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tnrfx"] Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.932120 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.933921 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17628042-4c9e-453d-bc17-e54b625cbc64-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"17628042-4c9e-453d-bc17-e54b625cbc64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.934015 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-utilities\") pod \"redhat-operators-tnrfx\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.934114 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rlmz\" (UniqueName: \"kubernetes.io/projected/262d049a-2c52-453b-a054-3a17b595d535-kube-api-access-6rlmz\") pod \"redhat-operators-tnrfx\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.934164 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-catalog-content\") pod \"redhat-operators-tnrfx\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.934192 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17628042-4c9e-453d-bc17-e54b625cbc64-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"17628042-4c9e-453d-bc17-e54b625cbc64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 13:59:45 crc kubenswrapper[4817]: I0218 13:59:45.942117 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnrfx"] Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.035899 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-utilities\") pod \"redhat-operators-tnrfx\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.036559 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rlmz\" (UniqueName: \"kubernetes.io/projected/262d049a-2c52-453b-a054-3a17b595d535-kube-api-access-6rlmz\") pod \"redhat-operators-tnrfx\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.036591 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-catalog-content\") pod \"redhat-operators-tnrfx\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.036620 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17628042-4c9e-453d-bc17-e54b625cbc64-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"17628042-4c9e-453d-bc17-e54b625cbc64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.036656 4817 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17628042-4c9e-453d-bc17-e54b625cbc64-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"17628042-4c9e-453d-bc17-e54b625cbc64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.036737 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17628042-4c9e-453d-bc17-e54b625cbc64-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"17628042-4c9e-453d-bc17-e54b625cbc64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.037202 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-utilities\") pod \"redhat-operators-tnrfx\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.037468 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-catalog-content\") pod \"redhat-operators-tnrfx\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.066135 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17628042-4c9e-453d-bc17-e54b625cbc64-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"17628042-4c9e-453d-bc17-e54b625cbc64\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.066182 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rlmz\" (UniqueName: 
\"kubernetes.io/projected/262d049a-2c52-453b-a054-3a17b595d535-kube-api-access-6rlmz\") pod \"redhat-operators-tnrfx\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:46 crc kubenswrapper[4817]: W0218 13:59:46.129150 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4faf1743_e825_477d_b191_830513a39317.slice/crio-f1ee523467d8f30e34fede55c323c6a464aa4503afd82a661b8535db3b1b1a3f WatchSource:0}: Error finding container f1ee523467d8f30e34fede55c323c6a464aa4503afd82a661b8535db3b1b1a3f: Status 404 returned error can't find the container with id f1ee523467d8f30e34fede55c323c6a464aa4503afd82a661b8535db3b1b1a3f Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.129681 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mddfd"] Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.171079 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.197305 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.266625 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.421863 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7925"] Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.500114 4817 generic.go:334] "Generic (PLEG): container finished" podID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerID="0c1fe5e133d24e57580de1ecbfddb8804019d9dbcbcecc4e4d25a091a3cdef12" exitCode=0 Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.500878 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rx2s" event={"ID":"6a6721c8-f88b-4812-9cce-0e6d959d5fa6","Type":"ContainerDied","Data":"0c1fe5e133d24e57580de1ecbfddb8804019d9dbcbcecc4e4d25a091a3cdef12"} Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.500906 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rx2s" event={"ID":"6a6721c8-f88b-4812-9cce-0e6d959d5fa6","Type":"ContainerStarted","Data":"89123074c9267fed28141100445d38cd151b06de54c450a33a7a15d7b3afd2e5"} Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.513422 4817 generic.go:334] "Generic (PLEG): container finished" podID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerID="c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc" exitCode=0 Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.513501 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6w5" event={"ID":"a860a4b3-b2dd-4d2f-8d2f-a959007a6197","Type":"ContainerDied","Data":"c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc"} Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.553266 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" 
event={"ID":"4faf1743-e825-477d-b191-830513a39317","Type":"ContainerStarted","Data":"f1ee523467d8f30e34fede55c323c6a464aa4503afd82a661b8535db3b1b1a3f"} Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.562409 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"557bca4a-3b11-4613-bc75-3bdfd3ddbd25","Type":"ContainerStarted","Data":"555237c58dfc9831f2e91c6ba0b1c78f3dbc3438c0890a17660cd52050c645fd"} Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.571011 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7925" event={"ID":"44a62058-ed9b-4364-97d0-09af2bb1c22d","Type":"ContainerStarted","Data":"409b121b87837cca791c4f60e0e46e6192da6970ca02ef9ba5489e96f1250b8f"} Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.591610 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.616258 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.616313 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.625861 4817 patch_prober.go:28] interesting pod/apiserver-76f77b778f-4rsdh container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]log ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]etcd ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]poststarthook/generic-apiserver-start-informers ok Feb 18 13:59:46 crc kubenswrapper[4817]: 
[+]poststarthook/max-in-flight-filter ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 18 13:59:46 crc kubenswrapper[4817]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 18 13:59:46 crc kubenswrapper[4817]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]poststarthook/project.openshift.io-projectcache ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]poststarthook/openshift.io-startinformers ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 18 13:59:46 crc kubenswrapper[4817]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 18 13:59:46 crc kubenswrapper[4817]: livez check failed Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.626033 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh" podUID="67bbfc49-2d2d-4e26-8f38-cc7d2fd6765c" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.626566 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.626547221 podStartE2EDuration="2.626547221s" podCreationTimestamp="2026-02-18 13:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:46.623508382 +0000 UTC m=+49.199044355" watchObservedRunningTime="2026-02-18 13:59:46.626547221 +0000 UTC m=+49.202083204" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.629151 4817 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-v2snv" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.629183 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-v2snv" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.636635 4817 patch_prober.go:28] interesting pod/console-f9d7485db-v2snv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.636691 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v2snv" podUID="149bcfc3-9623-403e-8c4c-1019bd5f0c16" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.645012 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-wwvbf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.645072 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wwvbf" podUID="5345e1d1-a74f-4d8f-8b86-2bb389a525a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.645088 4817 patch_prober.go:28] interesting pod/downloads-7954f5f757-wwvbf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" 
start-of-body= Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.645155 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wwvbf" podUID="5345e1d1-a74f-4d8f-8b86-2bb389a525a2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 18 13:59:46 crc kubenswrapper[4817]: W0218 13:59:46.666003 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod17628042_4c9e_453d_bc17_e54b625cbc64.slice/crio-732acdd11a85b81605ad08aa18e93e9a2f9b4be9b628f2fa6bb1545de9fc71c0 WatchSource:0}: Error finding container 732acdd11a85b81605ad08aa18e93e9a2f9b4be9b628f2fa6bb1545de9fc71c0: Status 404 returned error can't find the container with id 732acdd11a85b81605ad08aa18e93e9a2f9b4be9b628f2fa6bb1545de9fc71c0 Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.804218 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6w9rz" Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.810852 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnrfx"] Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.811684 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 13:59:46 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Feb 18 13:59:46 crc kubenswrapper[4817]: [+]process-running ok Feb 18 13:59:46 crc kubenswrapper[4817]: healthz check failed Feb 18 13:59:46 crc kubenswrapper[4817]: I0218 13:59:46.811733 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.583450 4817 generic.go:334] "Generic (PLEG): container finished" podID="557bca4a-3b11-4613-bc75-3bdfd3ddbd25" containerID="555237c58dfc9831f2e91c6ba0b1c78f3dbc3438c0890a17660cd52050c645fd" exitCode=0 Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.583689 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"557bca4a-3b11-4613-bc75-3bdfd3ddbd25","Type":"ContainerDied","Data":"555237c58dfc9831f2e91c6ba0b1c78f3dbc3438c0890a17660cd52050c645fd"} Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.588399 4817 generic.go:334] "Generic (PLEG): container finished" podID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerID="89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2" exitCode=0 Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.588873 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7925" event={"ID":"44a62058-ed9b-4364-97d0-09af2bb1c22d","Type":"ContainerDied","Data":"89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2"} Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.591890 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" event={"ID":"4faf1743-e825-477d-b191-830513a39317","Type":"ContainerStarted","Data":"5b48d2dceb1c7cf78a9dd57d6b9a2d6bd5ee08ab58e00b48402ac4ecacd2c1ce"} Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.592092 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.610183 4817 generic.go:334] "Generic (PLEG): container finished" podID="262d049a-2c52-453b-a054-3a17b595d535" 
containerID="c1ca46ca13eead25086d7cfba371f7db6f3d8f0f30cc2f2dcf67c84a101948a5" exitCode=0 Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.610273 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnrfx" event={"ID":"262d049a-2c52-453b-a054-3a17b595d535","Type":"ContainerDied","Data":"c1ca46ca13eead25086d7cfba371f7db6f3d8f0f30cc2f2dcf67c84a101948a5"} Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.610296 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnrfx" event={"ID":"262d049a-2c52-453b-a054-3a17b595d535","Type":"ContainerStarted","Data":"a3db0c2b12a86a48bcdd47504e810d8fa320217e8c4786b0882dc49689502b25"} Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.620254 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"17628042-4c9e-453d-bc17-e54b625cbc64","Type":"ContainerStarted","Data":"47ce642654c412b15df90fcdaaed14a7ec3e3734ab3d257814da4ec93ff08f9a"} Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.620299 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"17628042-4c9e-453d-bc17-e54b625cbc64","Type":"ContainerStarted","Data":"732acdd11a85b81605ad08aa18e93e9a2f9b4be9b628f2fa6bb1545de9fc71c0"} Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.669440 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" podStartSLOduration=27.668345026 podStartE2EDuration="27.668345026s" podCreationTimestamp="2026-02-18 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:47.643246976 +0000 UTC m=+50.218782959" watchObservedRunningTime="2026-02-18 13:59:47.668345026 +0000 UTC m=+50.243881009" Feb 18 13:59:47 crc 
kubenswrapper[4817]: I0218 13:59:47.690310 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.690286583 podStartE2EDuration="2.690286583s" podCreationTimestamp="2026-02-18 13:59:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:47.687705347 +0000 UTC m=+50.263241330" watchObservedRunningTime="2026-02-18 13:59:47.690286583 +0000 UTC m=+50.265822566" Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.806617 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 13:59:47 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Feb 18 13:59:47 crc kubenswrapper[4817]: [+]process-running ok Feb 18 13:59:47 crc kubenswrapper[4817]: healthz check failed Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.806758 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.939611 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.939658 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:47 crc kubenswrapper[4817]: I0218 13:59:47.947092 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:48 crc 
kubenswrapper[4817]: E0218 13:59:48.080218 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 13:59:48 crc kubenswrapper[4817]: E0218 13:59:48.084731 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 13:59:48 crc kubenswrapper[4817]: E0218 13:59:48.087022 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 18 13:59:48 crc kubenswrapper[4817]: E0218 13:59:48.087056 4817 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" podUID="3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" containerName="kube-multus-additional-cni-plugins" Feb 18 13:59:48 crc kubenswrapper[4817]: I0218 13:59:48.651271 4817 generic.go:334] "Generic (PLEG): container finished" podID="17628042-4c9e-453d-bc17-e54b625cbc64" containerID="47ce642654c412b15df90fcdaaed14a7ec3e3734ab3d257814da4ec93ff08f9a" exitCode=0 Feb 18 13:59:48 crc kubenswrapper[4817]: I0218 13:59:48.651431 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"17628042-4c9e-453d-bc17-e54b625cbc64","Type":"ContainerDied","Data":"47ce642654c412b15df90fcdaaed14a7ec3e3734ab3d257814da4ec93ff08f9a"} Feb 18 13:59:48 crc kubenswrapper[4817]: I0218 13:59:48.661369 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pq5tr" Feb 18 13:59:48 crc kubenswrapper[4817]: I0218 13:59:48.825201 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 18 13:59:48 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld Feb 18 13:59:48 crc kubenswrapper[4817]: [+]process-running ok Feb 18 13:59:48 crc kubenswrapper[4817]: healthz check failed Feb 18 13:59:48 crc kubenswrapper[4817]: I0218 13:59:48.825275 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 13:59:48 crc kubenswrapper[4817]: I0218 13:59:48.977930 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.100343 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kube-api-access\") pod \"557bca4a-3b11-4613-bc75-3bdfd3ddbd25\" (UID: \"557bca4a-3b11-4613-bc75-3bdfd3ddbd25\") " Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.100447 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kubelet-dir\") pod \"557bca4a-3b11-4613-bc75-3bdfd3ddbd25\" (UID: \"557bca4a-3b11-4613-bc75-3bdfd3ddbd25\") " Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.100842 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "557bca4a-3b11-4613-bc75-3bdfd3ddbd25" (UID: "557bca4a-3b11-4613-bc75-3bdfd3ddbd25"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.109904 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "557bca4a-3b11-4613-bc75-3bdfd3ddbd25" (UID: "557bca4a-3b11-4613-bc75-3bdfd3ddbd25"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.202544 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.202590 4817 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/557bca4a-3b11-4613-bc75-3bdfd3ddbd25-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.692671 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.692758 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"557bca4a-3b11-4613-bc75-3bdfd3ddbd25","Type":"ContainerDied","Data":"b05f406f5e49c4927e3fcce61324e1cbe387e08a8b73b2d35c62b2cdef42168d"}
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.696152 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b05f406f5e49c4927e3fcce61324e1cbe387e08a8b73b2d35c62b2cdef42168d"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.814138 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:49 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:49 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:49 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.814207 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.915462 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.915532 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.916540 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.916611 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.918437 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.921238 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.921703 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.934800 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 13:59:49 crc kubenswrapper[4817]: I0218 13:59:49.952057 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.090843 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.098951 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.161911 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.323556 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17628042-4c9e-453d-bc17-e54b625cbc64-kube-api-access\") pod \"17628042-4c9e-453d-bc17-e54b625cbc64\" (UID: \"17628042-4c9e-453d-bc17-e54b625cbc64\") "
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.323955 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17628042-4c9e-453d-bc17-e54b625cbc64-kubelet-dir\") pod \"17628042-4c9e-453d-bc17-e54b625cbc64\" (UID: \"17628042-4c9e-453d-bc17-e54b625cbc64\") "
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.324291 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17628042-4c9e-453d-bc17-e54b625cbc64-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "17628042-4c9e-453d-bc17-e54b625cbc64" (UID: "17628042-4c9e-453d-bc17-e54b625cbc64"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.344105 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17628042-4c9e-453d-bc17-e54b625cbc64-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "17628042-4c9e-453d-bc17-e54b625cbc64" (UID: "17628042-4c9e-453d-bc17-e54b625cbc64"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 13:59:50 crc kubenswrapper[4817]: W0218 13:59:50.365765 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-8e5c15d6aad0a04a4f395616f6ff9c825e68cb7fe5b63dd64ca4ee7700bec87e WatchSource:0}: Error finding container 8e5c15d6aad0a04a4f395616f6ff9c825e68cb7fe5b63dd64ca4ee7700bec87e: Status 404 returned error can't find the container with id 8e5c15d6aad0a04a4f395616f6ff9c825e68cb7fe5b63dd64ca4ee7700bec87e
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.432657 4817 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17628042-4c9e-453d-bc17-e54b625cbc64-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.432692 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17628042-4c9e-453d-bc17-e54b625cbc64-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 18 13:59:50 crc kubenswrapper[4817]: W0218 13:59:50.477254 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-ee41e42e34e7722438eff3ba5d123dc598f0f41fef0e61b1968c023f4d1c0b2a WatchSource:0}: Error finding container ee41e42e34e7722438eff3ba5d123dc598f0f41fef0e61b1968c023f4d1c0b2a: Status 404 returned error can't find the container with id ee41e42e34e7722438eff3ba5d123dc598f0f41fef0e61b1968c023f4d1c0b2a
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.718702 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ee41e42e34e7722438eff3ba5d123dc598f0f41fef0e61b1968c023f4d1c0b2a"}
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.725382 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0ca8803ed0d2417191377b88adf9bffa19c0c2c1a19d9af19182cd8a73fb582e"}
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.728458 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"17628042-4c9e-453d-bc17-e54b625cbc64","Type":"ContainerDied","Data":"732acdd11a85b81605ad08aa18e93e9a2f9b4be9b628f2fa6bb1545de9fc71c0"}
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.728524 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="732acdd11a85b81605ad08aa18e93e9a2f9b4be9b628f2fa6bb1545de9fc71c0"
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.728620 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.745452 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8e5c15d6aad0a04a4f395616f6ff9c825e68cb7fe5b63dd64ca4ee7700bec87e"}
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.821462 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:50 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:50 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:50 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:50 crc kubenswrapper[4817]: I0218 13:59:50.821559 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:51 crc kubenswrapper[4817]: I0218 13:59:51.624971 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:51 crc kubenswrapper[4817]: I0218 13:59:51.631333 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-4rsdh"
Feb 18 13:59:51 crc kubenswrapper[4817]: I0218 13:59:51.774339 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2b727d766d5542c9788ca667894a414ef293874da33279bc4a96f0edfdb1b611"}
Feb 18 13:59:51 crc kubenswrapper[4817]: I0218 13:59:51.774429 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 18 13:59:51 crc kubenswrapper[4817]: I0218 13:59:51.808307 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:51 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:51 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:51 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:51 crc kubenswrapper[4817]: I0218 13:59:51.808366 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:51 crc kubenswrapper[4817]: I0218 13:59:51.905309 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 18 13:59:51 crc kubenswrapper[4817]: I0218 13:59:51.931860 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Feb 18 13:59:52 crc kubenswrapper[4817]: I0218 13:59:52.479178 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 13:59:52 crc kubenswrapper[4817]: I0218 13:59:52.501151 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.501129455 podStartE2EDuration="1.501129455s" podCreationTimestamp="2026-02-18 13:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 13:59:52.49668722 +0000 UTC m=+55.072223213" watchObservedRunningTime="2026-02-18 13:59:52.501129455 +0000 UTC m=+55.076665438"
Feb 18 13:59:52 crc kubenswrapper[4817]: I0218 13:59:52.805755 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:52 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:52 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:52 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:52 crc kubenswrapper[4817]: I0218 13:59:52.805837 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:53 crc kubenswrapper[4817]: I0218 13:59:53.331345 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pqtc6"
Feb 18 13:59:53 crc kubenswrapper[4817]: I0218 13:59:53.810507 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:53 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:53 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:53 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:53 crc kubenswrapper[4817]: I0218 13:59:53.810620 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:54 crc kubenswrapper[4817]: I0218 13:59:54.807449 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:54 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:54 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:54 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:54 crc kubenswrapper[4817]: I0218 13:59:54.808117 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:55 crc kubenswrapper[4817]: I0218 13:59:55.805749 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:55 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:55 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:55 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:55 crc kubenswrapper[4817]: I0218 13:59:55.805829 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:56 crc kubenswrapper[4817]: I0218 13:59:56.630628 4817 patch_prober.go:28] interesting pod/console-f9d7485db-v2snv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Feb 18 13:59:56 crc kubenswrapper[4817]: I0218 13:59:56.630724 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-v2snv" podUID="149bcfc3-9623-403e-8c4c-1019bd5f0c16" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused"
Feb 18 13:59:56 crc kubenswrapper[4817]: I0218 13:59:56.652123 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wwvbf"
Feb 18 13:59:56 crc kubenswrapper[4817]: I0218 13:59:56.805812 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:56 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:56 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:56 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:56 crc kubenswrapper[4817]: I0218 13:59:56.805900 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:57 crc kubenswrapper[4817]: I0218 13:59:57.806020 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:57 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:57 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:57 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:57 crc kubenswrapper[4817]: I0218 13:59:57.806102 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:58 crc kubenswrapper[4817]: E0218 13:59:58.078945 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 13:59:58 crc kubenswrapper[4817]: E0218 13:59:58.080972 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 13:59:58 crc kubenswrapper[4817]: E0218 13:59:58.083928 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 13:59:58 crc kubenswrapper[4817]: E0218 13:59:58.083959 4817 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" podUID="3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" containerName="kube-multus-additional-cni-plugins"
Feb 18 13:59:58 crc kubenswrapper[4817]: I0218 13:59:58.807973 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:58 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:58 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:58 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:58 crc kubenswrapper[4817]: I0218 13:59:58.808354 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 13:59:58 crc kubenswrapper[4817]: I0218 13:59:58.850516 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"aae79ae24b2cb15a668af28b28424c0da7d07c82e770e4fee6764acdfbc43a42"}
Feb 18 13:59:58 crc kubenswrapper[4817]: I0218 13:59:58.856145 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0a29f8d985748d1d9c7b9fedd254a30b5188411405b5d886269942ea5e5cffed"}
Feb 18 13:59:59 crc kubenswrapper[4817]: I0218 13:59:59.806842 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 13:59:59 crc kubenswrapper[4817]: [-]has-synced failed: reason withheld
Feb 18 13:59:59 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 13:59:59 crc kubenswrapper[4817]: healthz check failed
Feb 18 13:59:59 crc kubenswrapper[4817]: I0218 13:59:59.806948 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.164683 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"]
Feb 18 14:00:00 crc kubenswrapper[4817]: E0218 14:00:00.164956 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17628042-4c9e-453d-bc17-e54b625cbc64" containerName="pruner"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.164970 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="17628042-4c9e-453d-bc17-e54b625cbc64" containerName="pruner"
Feb 18 14:00:00 crc kubenswrapper[4817]: E0218 14:00:00.165018 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557bca4a-3b11-4613-bc75-3bdfd3ddbd25" containerName="pruner"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.165025 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="557bca4a-3b11-4613-bc75-3bdfd3ddbd25" containerName="pruner"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.165120 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="17628042-4c9e-453d-bc17-e54b625cbc64" containerName="pruner"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.165136 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="557bca4a-3b11-4613-bc75-3bdfd3ddbd25" containerName="pruner"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.165542 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.169712 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.171857 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.213246 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46kp5\" (UniqueName: \"kubernetes.io/projected/22a49a67-343c-4b86-87b7-68804e001fb2-kube-api-access-46kp5\") pod \"collect-profiles-29523720-6wkzh\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.213463 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22a49a67-343c-4b86-87b7-68804e001fb2-secret-volume\") pod \"collect-profiles-29523720-6wkzh\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.213602 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22a49a67-343c-4b86-87b7-68804e001fb2-config-volume\") pod \"collect-profiles-29523720-6wkzh\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.239014 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"]
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.314641 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46kp5\" (UniqueName: \"kubernetes.io/projected/22a49a67-343c-4b86-87b7-68804e001fb2-kube-api-access-46kp5\") pod \"collect-profiles-29523720-6wkzh\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.314746 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22a49a67-343c-4b86-87b7-68804e001fb2-secret-volume\") pod \"collect-profiles-29523720-6wkzh\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.314799 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22a49a67-343c-4b86-87b7-68804e001fb2-config-volume\") pod \"collect-profiles-29523720-6wkzh\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.315758 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22a49a67-343c-4b86-87b7-68804e001fb2-config-volume\") pod \"collect-profiles-29523720-6wkzh\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.328794 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22a49a67-343c-4b86-87b7-68804e001fb2-secret-volume\") pod \"collect-profiles-29523720-6wkzh\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.331072 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46kp5\" (UniqueName: \"kubernetes.io/projected/22a49a67-343c-4b86-87b7-68804e001fb2-kube-api-access-46kp5\") pod \"collect-profiles-29523720-6wkzh\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.488153 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.806896 4817 patch_prober.go:28] interesting pod/router-default-5444994796-6w9rz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 18 14:00:00 crc kubenswrapper[4817]: [+]has-synced ok
Feb 18 14:00:00 crc kubenswrapper[4817]: [+]process-running ok
Feb 18 14:00:00 crc kubenswrapper[4817]: healthz check failed
Feb 18 14:00:00 crc kubenswrapper[4817]: I0218 14:00:00.807003 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6w9rz" podUID="0a42d5a9-1383-4a55-9c81-bd40eb5ba86f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 18 14:00:01 crc kubenswrapper[4817]: I0218 14:00:01.806641 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6w9rz"
Feb 18 14:00:01 crc kubenswrapper[4817]: I0218 14:00:01.811210 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6w9rz"
Feb 18 14:00:05 crc kubenswrapper[4817]: I0218 14:00:05.764490 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd"
Feb 18 14:00:06 crc kubenswrapper[4817]: I0218 14:00:06.721077 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 14:00:06 crc kubenswrapper[4817]: I0218 14:00:06.728473 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-v2snv"
Feb 18 14:00:08 crc kubenswrapper[4817]: E0218 14:00:08.080393 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 14:00:08 crc kubenswrapper[4817]: E0218 14:00:08.082566 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 14:00:08 crc kubenswrapper[4817]: E0218 14:00:08.084380 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 18 14:00:08 crc kubenswrapper[4817]: E0218 14:00:08.084441 4817 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" podUID="3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" containerName="kube-multus-additional-cni-plugins"
Feb 18 14:00:14 crc kubenswrapper[4817]: E0218 14:00:14.011777 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 18 14:00:14 crc kubenswrapper[4817]: E0218 14:00:14.012341 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dkmk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7rx2s_openshift-marketplace(6a6721c8-f88b-4812-9cce-0e6d959d5fa6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 18 14:00:14 crc kubenswrapper[4817]: E0218 14:00:14.013513 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7rx2s" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6"
Feb 18 14:00:14 crc kubenswrapper[4817]: E0218 14:00:14.017513 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 18 14:00:14 crc kubenswrapper[4817]: E0218 14:00:14.017646 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6rlmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tnrfx_openshift-marketplace(262d049a-2c52-453b-a054-3a17b595d535): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 18 14:00:14 crc kubenswrapper[4817]: E0218 14:00:14.018911 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tnrfx" podUID="262d049a-2c52-453b-a054-3a17b595d535"
Feb 18 14:00:14 crc
kubenswrapper[4817]: I0218 14:00:14.974389 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-pb2jx_3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982/kube-multus-additional-cni-plugins/0.log" Feb 18 14:00:14 crc kubenswrapper[4817]: I0218 14:00:14.974686 4817 generic.go:334] "Generic (PLEG): container finished" podID="3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" exitCode=137 Feb 18 14:00:14 crc kubenswrapper[4817]: I0218 14:00:14.974819 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" event={"ID":"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982","Type":"ContainerDied","Data":"1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc"} Feb 18 14:00:15 crc kubenswrapper[4817]: E0218 14:00:15.825381 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7rx2s" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" Feb 18 14:00:15 crc kubenswrapper[4817]: E0218 14:00:15.828654 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tnrfx" podUID="262d049a-2c52-453b-a054-3a17b595d535" Feb 18 14:00:15 crc kubenswrapper[4817]: E0218 14:00:15.897789 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 18 14:00:15 crc kubenswrapper[4817]: E0218 14:00:15.898042 4817 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wr8hq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hxln5_openshift-marketplace(8162b014-86d1-482a-8c7c-eba34fed3f62): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 14:00:15 crc kubenswrapper[4817]: E0218 14:00:15.899271 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hxln5" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" Feb 18 14:00:15 crc kubenswrapper[4817]: E0218 14:00:15.957814 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 18 14:00:15 crc kubenswrapper[4817]: E0218 14:00:15.958074 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wvd9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false
,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-86tmt_openshift-marketplace(99fce15c-b13c-4341-b1a0-494d5bd3f76a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 14:00:15 crc kubenswrapper[4817]: E0218 14:00:15.959591 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-86tmt" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" Feb 18 14:00:16 crc kubenswrapper[4817]: I0218 14:00:16.257155 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j4s9g" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.190317 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 18 14:00:17 crc kubenswrapper[4817]: E0218 14:00:17.449676 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hxln5" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" Feb 18 14:00:17 crc kubenswrapper[4817]: E0218 14:00:17.450632 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-86tmt" 
podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.535646 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-pb2jx_3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982/kube-multus-additional-cni-plugins/0.log" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.535724 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.555478 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.555455368 podStartE2EDuration="555.455368ms" podCreationTimestamp="2026-02-18 14:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:00:17.553027385 +0000 UTC m=+80.128563358" watchObservedRunningTime="2026-02-18 14:00:17.555455368 +0000 UTC m=+80.130991351" Feb 18 14:00:17 crc kubenswrapper[4817]: E0218 14:00:17.592767 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 18 14:00:17 crc kubenswrapper[4817]: E0218 14:00:17.593016 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rxcv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2z68r_openshift-marketplace(c8e393a5-61e4-4e91-bec4-770687b8d01b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 14:00:17 crc kubenswrapper[4817]: E0218 14:00:17.594255 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2z68r" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" Feb 18 14:00:17 crc 
kubenswrapper[4817]: I0218 14:00:17.600954 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-tuning-conf-dir\") pod \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.601074 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-ready\") pod \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.601148 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-cni-sysctl-allowlist\") pod \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.601290 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsknh\" (UniqueName: \"kubernetes.io/projected/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-kube-api-access-rsknh\") pod \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\" (UID: \"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982\") " Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.601337 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" (UID: "3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.601563 4817 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.601909 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-ready" (OuterVolumeSpecName: "ready") pod "3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" (UID: "3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.602339 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" (UID: "3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.612473 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-kube-api-access-rsknh" (OuterVolumeSpecName: "kube-api-access-rsknh") pod "3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" (UID: "3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982"). InnerVolumeSpecName "kube-api-access-rsknh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:00:17 crc kubenswrapper[4817]: E0218 14:00:17.633679 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 18 14:00:17 crc kubenswrapper[4817]: E0218 14:00:17.633905 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cc7nx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},
RestartPolicy:nil,} start failed in pod community-operators-54ftv_openshift-marketplace(307f9900-9137-46bb-9b32-254ae14c8c17): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 14:00:17 crc kubenswrapper[4817]: E0218 14:00:17.635231 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-54ftv" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.703797 4817 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.703827 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsknh\" (UniqueName: \"kubernetes.io/projected/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-kube-api-access-rsknh\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.703837 4817 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982-ready\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.898701 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"] Feb 18 14:00:17 crc kubenswrapper[4817]: W0218 14:00:17.937948 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22a49a67_343c_4b86_87b7_68804e001fb2.slice/crio-e808cfa31880b402b35c5ef6792afb02b22aa9e05e72af321a88c4d5998c3649 
WatchSource:0}: Error finding container e808cfa31880b402b35c5ef6792afb02b22aa9e05e72af321a88c4d5998c3649: Status 404 returned error can't find the container with id e808cfa31880b402b35c5ef6792afb02b22aa9e05e72af321a88c4d5998c3649 Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.992818 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh" event={"ID":"22a49a67-343c-4b86-87b7-68804e001fb2","Type":"ContainerStarted","Data":"e808cfa31880b402b35c5ef6792afb02b22aa9e05e72af321a88c4d5998c3649"} Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.995049 4817 generic.go:334] "Generic (PLEG): container finished" podID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerID="a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326" exitCode=0 Feb 18 14:00:17 crc kubenswrapper[4817]: I0218 14:00:17.995134 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6w5" event={"ID":"a860a4b3-b2dd-4d2f-8d2f-a959007a6197","Type":"ContainerDied","Data":"a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326"} Feb 18 14:00:18 crc kubenswrapper[4817]: I0218 14:00:18.004422 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-pb2jx_3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982/kube-multus-additional-cni-plugins/0.log" Feb 18 14:00:18 crc kubenswrapper[4817]: I0218 14:00:18.004586 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" event={"ID":"3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982","Type":"ContainerDied","Data":"4e261bdd0ab08dd24f717eaf074c2768ee0497563b058db330449c524941fcbe"} Feb 18 14:00:18 crc kubenswrapper[4817]: I0218 14:00:18.004684 4817 scope.go:117] "RemoveContainer" containerID="1c94d838fdef8a73cec016eeb66e2e241cbdafd8f76a6afa7ad7b14c12985adc" Feb 18 14:00:18 crc kubenswrapper[4817]: I0218 14:00:18.004836 4817 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-pb2jx" Feb 18 14:00:18 crc kubenswrapper[4817]: I0218 14:00:18.019571 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7925" event={"ID":"44a62058-ed9b-4364-97d0-09af2bb1c22d","Type":"ContainerStarted","Data":"b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02"} Feb 18 14:00:18 crc kubenswrapper[4817]: E0218 14:00:18.032477 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2z68r" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" Feb 18 14:00:18 crc kubenswrapper[4817]: E0218 14:00:18.038077 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-54ftv" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" Feb 18 14:00:18 crc kubenswrapper[4817]: I0218 14:00:18.117194 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-pb2jx"] Feb 18 14:00:18 crc kubenswrapper[4817]: I0218 14:00:18.121661 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-pb2jx"] Feb 18 14:00:18 crc kubenswrapper[4817]: I0218 14:00:18.180672 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" path="/var/lib/kubelet/pods/3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982/volumes" Feb 18 14:00:19 crc kubenswrapper[4817]: I0218 14:00:19.029874 4817 generic.go:334] "Generic (PLEG): container finished" podID="44a62058-ed9b-4364-97d0-09af2bb1c22d" 
containerID="b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02" exitCode=0 Feb 18 14:00:19 crc kubenswrapper[4817]: I0218 14:00:19.029963 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7925" event={"ID":"44a62058-ed9b-4364-97d0-09af2bb1c22d","Type":"ContainerDied","Data":"b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02"} Feb 18 14:00:19 crc kubenswrapper[4817]: I0218 14:00:19.036154 4817 generic.go:334] "Generic (PLEG): container finished" podID="22a49a67-343c-4b86-87b7-68804e001fb2" containerID="4ba66eb21295d93203ca6f817ddb9a3cf504e5521deb2f3fc1a0b9db0a0b6954" exitCode=0 Feb 18 14:00:19 crc kubenswrapper[4817]: I0218 14:00:19.036338 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh" event={"ID":"22a49a67-343c-4b86-87b7-68804e001fb2","Type":"ContainerDied","Data":"4ba66eb21295d93203ca6f817ddb9a3cf504e5521deb2f3fc1a0b9db0a0b6954"} Feb 18 14:00:19 crc kubenswrapper[4817]: I0218 14:00:19.041396 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6w5" event={"ID":"a860a4b3-b2dd-4d2f-8d2f-a959007a6197","Type":"ContainerStarted","Data":"52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec"} Feb 18 14:00:19 crc kubenswrapper[4817]: I0218 14:00:19.097935 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jp6w5" podStartSLOduration=1.975889467 podStartE2EDuration="35.097903675s" podCreationTimestamp="2026-02-18 13:59:44 +0000 UTC" firstStartedPulling="2026-02-18 13:59:45.372318041 +0000 UTC m=+47.947854024" lastFinishedPulling="2026-02-18 14:00:18.494332239 +0000 UTC m=+81.069868232" observedRunningTime="2026-02-18 14:00:19.095403491 +0000 UTC m=+81.670939504" watchObservedRunningTime="2026-02-18 14:00:19.097903675 +0000 UTC m=+81.673439668" Feb 18 14:00:20 crc kubenswrapper[4817]: 
I0218 14:00:20.050052 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7925" event={"ID":"44a62058-ed9b-4364-97d0-09af2bb1c22d","Type":"ContainerStarted","Data":"827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d"}
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.073997 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b7925" podStartSLOduration=3.1747254480000002 podStartE2EDuration="35.073945619s" podCreationTimestamp="2026-02-18 13:59:45 +0000 UTC" firstStartedPulling="2026-02-18 13:59:47.59391751 +0000 UTC m=+50.169453493" lastFinishedPulling="2026-02-18 14:00:19.493137641 +0000 UTC m=+82.068673664" observedRunningTime="2026-02-18 14:00:20.06896992 +0000 UTC m=+82.644505903" watchObservedRunningTime="2026-02-18 14:00:20.073945619 +0000 UTC m=+82.649481612"
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.393059 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l7vpp"]
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.393872 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" podUID="43f553ef-0150-4383-8c39-5db2cbcab63d" containerName="controller-manager" containerID="cri-o://72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e" gracePeriod=30
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.487548 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"]
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.487833 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" podUID="3707018b-031a-4902-8e5c-ba5bc46cc4c4" containerName="route-controller-manager" containerID="cri-o://3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502" gracePeriod=30
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.624402 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.743767 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22a49a67-343c-4b86-87b7-68804e001fb2-secret-volume\") pod \"22a49a67-343c-4b86-87b7-68804e001fb2\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") "
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.743852 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46kp5\" (UniqueName: \"kubernetes.io/projected/22a49a67-343c-4b86-87b7-68804e001fb2-kube-api-access-46kp5\") pod \"22a49a67-343c-4b86-87b7-68804e001fb2\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") "
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.743924 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22a49a67-343c-4b86-87b7-68804e001fb2-config-volume\") pod \"22a49a67-343c-4b86-87b7-68804e001fb2\" (UID: \"22a49a67-343c-4b86-87b7-68804e001fb2\") "
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.745260 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a49a67-343c-4b86-87b7-68804e001fb2-config-volume" (OuterVolumeSpecName: "config-volume") pod "22a49a67-343c-4b86-87b7-68804e001fb2" (UID: "22a49a67-343c-4b86-87b7-68804e001fb2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.756288 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a49a67-343c-4b86-87b7-68804e001fb2-kube-api-access-46kp5" (OuterVolumeSpecName: "kube-api-access-46kp5") pod "22a49a67-343c-4b86-87b7-68804e001fb2" (UID: "22a49a67-343c-4b86-87b7-68804e001fb2"). InnerVolumeSpecName "kube-api-access-46kp5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.756392 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22a49a67-343c-4b86-87b7-68804e001fb2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "22a49a67-343c-4b86-87b7-68804e001fb2" (UID: "22a49a67-343c-4b86-87b7-68804e001fb2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.845126 4817 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22a49a67-343c-4b86-87b7-68804e001fb2-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.845177 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46kp5\" (UniqueName: \"kubernetes.io/projected/22a49a67-343c-4b86-87b7-68804e001fb2-kube-api-access-46kp5\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.845190 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22a49a67-343c-4b86-87b7-68804e001fb2-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.992315 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 14:00:20 crc kubenswrapper[4817]: I0218 14:00:20.998964 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.049394 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58vgz\" (UniqueName: \"kubernetes.io/projected/3707018b-031a-4902-8e5c-ba5bc46cc4c4-kube-api-access-58vgz\") pod \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") "
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.049467 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config\") pod \"43f553ef-0150-4383-8c39-5db2cbcab63d\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") "
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.049959 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca\") pod \"43f553ef-0150-4383-8c39-5db2cbcab63d\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") "
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.050011 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-proxy-ca-bundles\") pod \"43f553ef-0150-4383-8c39-5db2cbcab63d\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") "
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.050046 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert\") pod \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") "
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.050512 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config\") pod \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") "
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.050593 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca\") pod \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\" (UID: \"3707018b-031a-4902-8e5c-ba5bc46cc4c4\") "
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.050659 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxmdb\" (UniqueName: \"kubernetes.io/projected/43f553ef-0150-4383-8c39-5db2cbcab63d-kube-api-access-kxmdb\") pod \"43f553ef-0150-4383-8c39-5db2cbcab63d\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") "
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.050686 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert\") pod \"43f553ef-0150-4383-8c39-5db2cbcab63d\" (UID: \"43f553ef-0150-4383-8c39-5db2cbcab63d\") "
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.052475 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca" (OuterVolumeSpecName: "client-ca") pod "3707018b-031a-4902-8e5c-ba5bc46cc4c4" (UID: "3707018b-031a-4902-8e5c-ba5bc46cc4c4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.052552 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config" (OuterVolumeSpecName: "config") pod "3707018b-031a-4902-8e5c-ba5bc46cc4c4" (UID: "3707018b-031a-4902-8e5c-ba5bc46cc4c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.053472 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config" (OuterVolumeSpecName: "config") pod "43f553ef-0150-4383-8c39-5db2cbcab63d" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.054112 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca" (OuterVolumeSpecName: "client-ca") pod "43f553ef-0150-4383-8c39-5db2cbcab63d" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.054886 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "43f553ef-0150-4383-8c39-5db2cbcab63d" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.056353 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3707018b-031a-4902-8e5c-ba5bc46cc4c4" (UID: "3707018b-031a-4902-8e5c-ba5bc46cc4c4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.056374 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "43f553ef-0150-4383-8c39-5db2cbcab63d" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.056731 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f553ef-0150-4383-8c39-5db2cbcab63d-kube-api-access-kxmdb" (OuterVolumeSpecName: "kube-api-access-kxmdb") pod "43f553ef-0150-4383-8c39-5db2cbcab63d" (UID: "43f553ef-0150-4383-8c39-5db2cbcab63d"). InnerVolumeSpecName "kube-api-access-kxmdb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.060711 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3707018b-031a-4902-8e5c-ba5bc46cc4c4-kube-api-access-58vgz" (OuterVolumeSpecName: "kube-api-access-58vgz") pod "3707018b-031a-4902-8e5c-ba5bc46cc4c4" (UID: "3707018b-031a-4902-8e5c-ba5bc46cc4c4"). InnerVolumeSpecName "kube-api-access-58vgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.068765 4817 generic.go:334] "Generic (PLEG): container finished" podID="3707018b-031a-4902-8e5c-ba5bc46cc4c4" containerID="3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502" exitCode=0
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.068842 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.068880 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" event={"ID":"3707018b-031a-4902-8e5c-ba5bc46cc4c4","Type":"ContainerDied","Data":"3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502"}
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.068958 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr" event={"ID":"3707018b-031a-4902-8e5c-ba5bc46cc4c4","Type":"ContainerDied","Data":"9fd80a5bb0c0fbd10d8b687a39ea6f110c7385b2921dfc302b535b082dcc4fa3"}
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.069011 4817 scope.go:117] "RemoveContainer" containerID="3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.072630 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh" event={"ID":"22a49a67-343c-4b86-87b7-68804e001fb2","Type":"ContainerDied","Data":"e808cfa31880b402b35c5ef6792afb02b22aa9e05e72af321a88c4d5998c3649"}
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.072668 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e808cfa31880b402b35c5ef6792afb02b22aa9e05e72af321a88c4d5998c3649"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.072758 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.080436 4817 generic.go:334] "Generic (PLEG): container finished" podID="43f553ef-0150-4383-8c39-5db2cbcab63d" containerID="72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e" exitCode=0
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.080594 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.080644 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" event={"ID":"43f553ef-0150-4383-8c39-5db2cbcab63d","Type":"ContainerDied","Data":"72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e"}
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.080689 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l7vpp" event={"ID":"43f553ef-0150-4383-8c39-5db2cbcab63d","Type":"ContainerDied","Data":"4e3851f97c3471d3814ea376705e8761238b95fdfd86b1dcc595c576a6eea6f2"}
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.112333 4817 scope.go:117] "RemoveContainer" containerID="3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502"
Feb 18 14:00:21 crc kubenswrapper[4817]: E0218 14:00:21.113097 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502\": container with ID starting with 3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502 not found: ID does not exist" containerID="3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.113210 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502"} err="failed to get container status \"3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502\": rpc error: code = NotFound desc = could not find container \"3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502\": container with ID starting with 3abfed9bee2d45733707c3a904422161329ef07885de51da9a8b1ec2afa1c502 not found: ID does not exist"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.113371 4817 scope.go:117] "RemoveContainer" containerID="72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.124811 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"]
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.130043 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hrdpr"]
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.132222 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l7vpp"]
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.133404 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l7vpp"]
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.135236 4817 scope.go:117] "RemoveContainer" containerID="72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e"
Feb 18 14:00:21 crc kubenswrapper[4817]: E0218 14:00:21.135698 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e\": container with ID starting with 72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e not found: ID does not exist" containerID="72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.135725 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e"} err="failed to get container status \"72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e\": rpc error: code = NotFound desc = could not find container \"72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e\": container with ID starting with 72e0b658c7f8a4f66e8521c14702f84177b35fba5b01b6da44ee82f081f3819e not found: ID does not exist"
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.152526 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43f553ef-0150-4383-8c39-5db2cbcab63d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.152566 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58vgz\" (UniqueName: \"kubernetes.io/projected/3707018b-031a-4902-8e5c-ba5bc46cc4c4-kube-api-access-58vgz\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.152583 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.152592 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.152603 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/43f553ef-0150-4383-8c39-5db2cbcab63d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.152613 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3707018b-031a-4902-8e5c-ba5bc46cc4c4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.152626 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.152636 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3707018b-031a-4902-8e5c-ba5bc46cc4c4-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:21 crc kubenswrapper[4817]: I0218 14:00:21.152645 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxmdb\" (UniqueName: \"kubernetes.io/projected/43f553ef-0150-4383-8c39-5db2cbcab63d-kube-api-access-kxmdb\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.186932 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3707018b-031a-4902-8e5c-ba5bc46cc4c4" path="/var/lib/kubelet/pods/3707018b-031a-4902-8e5c-ba5bc46cc4c4/volumes"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.187659 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f553ef-0150-4383-8c39-5db2cbcab63d" path="/var/lib/kubelet/pods/43f553ef-0150-4383-8c39-5db2cbcab63d/volumes"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.435824 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"]
Feb 18 14:00:22 crc kubenswrapper[4817]: E0218 14:00:22.436247 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a49a67-343c-4b86-87b7-68804e001fb2" containerName="collect-profiles"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.436269 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a49a67-343c-4b86-87b7-68804e001fb2" containerName="collect-profiles"
Feb 18 14:00:22 crc kubenswrapper[4817]: E0218 14:00:22.436287 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3707018b-031a-4902-8e5c-ba5bc46cc4c4" containerName="route-controller-manager"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.436298 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3707018b-031a-4902-8e5c-ba5bc46cc4c4" containerName="route-controller-manager"
Feb 18 14:00:22 crc kubenswrapper[4817]: E0218 14:00:22.436309 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f553ef-0150-4383-8c39-5db2cbcab63d" containerName="controller-manager"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.436317 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f553ef-0150-4383-8c39-5db2cbcab63d" containerName="controller-manager"
Feb 18 14:00:22 crc kubenswrapper[4817]: E0218 14:00:22.436332 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" containerName="kube-multus-additional-cni-plugins"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.436340 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" containerName="kube-multus-additional-cni-plugins"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.436490 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa6d1ae-bbc9-4cb8-a1ca-56da8608e982" containerName="kube-multus-additional-cni-plugins"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.436505 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a49a67-343c-4b86-87b7-68804e001fb2" containerName="collect-profiles"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.436524 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f553ef-0150-4383-8c39-5db2cbcab63d" containerName="controller-manager"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.436532 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="3707018b-031a-4902-8e5c-ba5bc46cc4c4" containerName="route-controller-manager"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.437867 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.440576 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.440639 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.440817 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.440945 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.441276 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.442066 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.445358 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"]
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.446090 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.451399 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.452120 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.452273 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.452367 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.452532 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.452714 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.452760 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.461679 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"]
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.465646 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"]
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.473101 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-config\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.473152 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-client-ca\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.473212 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-proxy-ca-bundles\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.473242 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhdml\" (UniqueName: \"kubernetes.io/projected/baf9f8d0-6947-408f-b2c9-404c3ee736ef-kube-api-access-fhdml\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.473271 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baf9f8d0-6947-408f-b2c9-404c3ee736ef-serving-cert\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.574425 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d34513-4dab-417b-8a46-485f4d69b0f0-serving-cert\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.574513 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-proxy-ca-bundles\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.574556 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhdml\" (UniqueName: \"kubernetes.io/projected/baf9f8d0-6947-408f-b2c9-404c3ee736ef-kube-api-access-fhdml\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.574595 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baf9f8d0-6947-408f-b2c9-404c3ee736ef-serving-cert\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.574647 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-client-ca\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.574732 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-config\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.574797 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-config\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.574853 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-client-ca\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.574897 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlwtd\" (UniqueName: \"kubernetes.io/projected/a7d34513-4dab-417b-8a46-485f4d69b0f0-kube-api-access-zlwtd\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.576601 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-client-ca\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.576624 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-proxy-ca-bundles\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.577568 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-config\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.580389 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baf9f8d0-6947-408f-b2c9-404c3ee736ef-serving-cert\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.592817 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhdml\" (UniqueName: \"kubernetes.io/projected/baf9f8d0-6947-408f-b2c9-404c3ee736ef-kube-api-access-fhdml\") pod \"controller-manager-5d6b7b4f9b-rj4jr\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") " pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.676619 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d34513-4dab-417b-8a46-485f4d69b0f0-serving-cert\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.676733 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-client-ca\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.676803 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-config\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.676854 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlwtd\" (UniqueName: \"kubernetes.io/projected/a7d34513-4dab-417b-8a46-485f4d69b0f0-kube-api-access-zlwtd\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.678035 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-client-ca\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.678348 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-config\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.682144 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d34513-4dab-417b-8a46-485f4d69b0f0-serving-cert\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.695031 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlwtd\" (UniqueName: \"kubernetes.io/projected/a7d34513-4dab-417b-8a46-485f4d69b0f0-kube-api-access-zlwtd\") pod \"route-controller-manager-67679b45f6-zc9dp\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") " pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.779825 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:22 crc kubenswrapper[4817]: I0218 14:00:22.788619 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp" Feb 18 14:00:23 crc kubenswrapper[4817]: I0218 14:00:23.093340 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"] Feb 18 14:00:23 crc kubenswrapper[4817]: I0218 14:00:23.237475 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"] Feb 18 14:00:23 crc kubenswrapper[4817]: W0218 14:00:23.241199 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7d34513_4dab_417b_8a46_485f4d69b0f0.slice/crio-bc1039387b7498ab4518b596624e4a28b03653d2edde0ac65f82a20cc51b4715 WatchSource:0}: Error finding container bc1039387b7498ab4518b596624e4a28b03653d2edde0ac65f82a20cc51b4715: Status 404 returned error can't find the container with id bc1039387b7498ab4518b596624e4a28b03653d2edde0ac65f82a20cc51b4715 Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.109106 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp" event={"ID":"a7d34513-4dab-417b-8a46-485f4d69b0f0","Type":"ContainerStarted","Data":"81747b7bce374f699fde9e66791d8f8e3fb4e115052f57b1f6cee52363d1e7b9"} Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.109640 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp" Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.109661 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp" event={"ID":"a7d34513-4dab-417b-8a46-485f4d69b0f0","Type":"ContainerStarted","Data":"bc1039387b7498ab4518b596624e4a28b03653d2edde0ac65f82a20cc51b4715"} Feb 18 14:00:24 crc 
kubenswrapper[4817]: I0218 14:00:24.111461 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr" event={"ID":"baf9f8d0-6947-408f-b2c9-404c3ee736ef","Type":"ContainerStarted","Data":"40b7df6d72c27392b5e43146e1e4e12a88088599937810b2285e75a27969cdfd"} Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.111537 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr" event={"ID":"baf9f8d0-6947-408f-b2c9-404c3ee736ef","Type":"ContainerStarted","Data":"b5c66ec1889d9fcc32d135292dcbaaa3d8bc66756b48840ea1632a04f8c3832d"} Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.111778 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr" Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.117316 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr" Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.135586 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp" podStartSLOduration=4.135560886 podStartE2EDuration="4.135560886s" podCreationTimestamp="2026-02-18 14:00:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:00:24.131936082 +0000 UTC m=+86.707472085" watchObservedRunningTime="2026-02-18 14:00:24.135560886 +0000 UTC m=+86.711096889" Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.153354 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr" podStartSLOduration=4.153335126 podStartE2EDuration="4.153335126s" podCreationTimestamp="2026-02-18 14:00:20 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:00:24.151865458 +0000 UTC m=+86.727401451" watchObservedRunningTime="2026-02-18 14:00:24.153335126 +0000 UTC m=+86.728871119" Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.338694 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp" Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.689616 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.689855 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 14:00:24 crc kubenswrapper[4817]: I0218 14:00:24.903224 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.033417 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.034869 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.037813 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.038723 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.051061 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.109887 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3355df72-6ce6-4303-93e6-c957dc265d43-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3355df72-6ce6-4303-93e6-c957dc265d43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.110090 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3355df72-6ce6-4303-93e6-c957dc265d43-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3355df72-6ce6-4303-93e6-c957dc265d43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.159864 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.211373 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3355df72-6ce6-4303-93e6-c957dc265d43-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3355df72-6ce6-4303-93e6-c957dc265d43\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.211510 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3355df72-6ce6-4303-93e6-c957dc265d43-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3355df72-6ce6-4303-93e6-c957dc265d43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.211810 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3355df72-6ce6-4303-93e6-c957dc265d43-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3355df72-6ce6-4303-93e6-c957dc265d43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.232959 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3355df72-6ce6-4303-93e6-c957dc265d43-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3355df72-6ce6-4303-93e6-c957dc265d43\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.352246 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.824899 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 18 14:00:25 crc kubenswrapper[4817]: W0218 14:00:25.833212 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3355df72_6ce6_4303_93e6_c957dc265d43.slice/crio-a216e32c9ee8e2f98b0c09b9ed7b50631efc5ca910ba06e2372631d183e2dd00 WatchSource:0}: Error finding container a216e32c9ee8e2f98b0c09b9ed7b50631efc5ca910ba06e2372631d183e2dd00: Status 404 returned error can't find the container with id a216e32c9ee8e2f98b0c09b9ed7b50631efc5ca910ba06e2372631d183e2dd00 Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.916642 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b7925" Feb 18 14:00:25 crc kubenswrapper[4817]: I0218 14:00:25.916730 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b7925" Feb 18 14:00:26 crc kubenswrapper[4817]: I0218 14:00:26.124797 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3355df72-6ce6-4303-93e6-c957dc265d43","Type":"ContainerStarted","Data":"a216e32c9ee8e2f98b0c09b9ed7b50631efc5ca910ba06e2372631d183e2dd00"} Feb 18 14:00:26 crc kubenswrapper[4817]: I0218 14:00:26.977877 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b7925" podUID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerName="registry-server" probeResult="failure" output=< Feb 18 14:00:26 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 14:00:26 crc kubenswrapper[4817]: > Feb 18 14:00:27 crc kubenswrapper[4817]: I0218 14:00:27.136046 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3355df72-6ce6-4303-93e6-c957dc265d43","Type":"ContainerStarted","Data":"d7299e2f8396f1e5d0d2399ed313dea3d83cced8214426be417ac0b5257587c7"} Feb 18 14:00:27 crc kubenswrapper[4817]: I0218 14:00:27.156879 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.156854016 podStartE2EDuration="2.156854016s" podCreationTimestamp="2026-02-18 14:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:00:27.156496657 +0000 UTC m=+89.732032650" watchObservedRunningTime="2026-02-18 14:00:27.156854016 +0000 UTC m=+89.732389999" Feb 18 14:00:28 crc kubenswrapper[4817]: I0218 14:00:28.144629 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rx2s" event={"ID":"6a6721c8-f88b-4812-9cce-0e6d959d5fa6","Type":"ContainerStarted","Data":"a6f16947d34f156b8bc2a1bbb19ce4c538cad6f5c149b7b7328d86a76beb7b91"} Feb 18 14:00:28 crc kubenswrapper[4817]: I0218 14:00:28.148079 4817 generic.go:334] "Generic (PLEG): container finished" podID="3355df72-6ce6-4303-93e6-c957dc265d43" containerID="d7299e2f8396f1e5d0d2399ed313dea3d83cced8214426be417ac0b5257587c7" exitCode=0 Feb 18 14:00:28 crc kubenswrapper[4817]: I0218 14:00:28.148153 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3355df72-6ce6-4303-93e6-c957dc265d43","Type":"ContainerDied","Data":"d7299e2f8396f1e5d0d2399ed313dea3d83cced8214426be417ac0b5257587c7"} Feb 18 14:00:28 crc kubenswrapper[4817]: I0218 14:00:28.151511 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnrfx" event={"ID":"262d049a-2c52-453b-a054-3a17b595d535","Type":"ContainerStarted","Data":"2d9d2e2ff1a1966c633ab7b28ba0f193515d3c1f1e534ef12985232374250949"} 
Feb 18 14:00:29 crc kubenswrapper[4817]: I0218 14:00:29.163941 4817 generic.go:334] "Generic (PLEG): container finished" podID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerID="a6f16947d34f156b8bc2a1bbb19ce4c538cad6f5c149b7b7328d86a76beb7b91" exitCode=0 Feb 18 14:00:29 crc kubenswrapper[4817]: I0218 14:00:29.164042 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rx2s" event={"ID":"6a6721c8-f88b-4812-9cce-0e6d959d5fa6","Type":"ContainerDied","Data":"a6f16947d34f156b8bc2a1bbb19ce4c538cad6f5c149b7b7328d86a76beb7b91"} Feb 18 14:00:29 crc kubenswrapper[4817]: I0218 14:00:29.168361 4817 generic.go:334] "Generic (PLEG): container finished" podID="262d049a-2c52-453b-a054-3a17b595d535" containerID="2d9d2e2ff1a1966c633ab7b28ba0f193515d3c1f1e534ef12985232374250949" exitCode=0 Feb 18 14:00:29 crc kubenswrapper[4817]: I0218 14:00:29.168415 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnrfx" event={"ID":"262d049a-2c52-453b-a054-3a17b595d535","Type":"ContainerDied","Data":"2d9d2e2ff1a1966c633ab7b28ba0f193515d3c1f1e534ef12985232374250949"} Feb 18 14:00:29 crc kubenswrapper[4817]: I0218 14:00:29.965220 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 18 14:00:30 crc kubenswrapper[4817]: I0218 14:00:30.260862 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 14:00:30 crc kubenswrapper[4817]: I0218 14:00:30.391426 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3355df72-6ce6-4303-93e6-c957dc265d43-kubelet-dir\") pod \"3355df72-6ce6-4303-93e6-c957dc265d43\" (UID: \"3355df72-6ce6-4303-93e6-c957dc265d43\") " Feb 18 14:00:30 crc kubenswrapper[4817]: I0218 14:00:30.391902 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3355df72-6ce6-4303-93e6-c957dc265d43-kube-api-access\") pod \"3355df72-6ce6-4303-93e6-c957dc265d43\" (UID: \"3355df72-6ce6-4303-93e6-c957dc265d43\") " Feb 18 14:00:30 crc kubenswrapper[4817]: I0218 14:00:30.391628 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3355df72-6ce6-4303-93e6-c957dc265d43-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3355df72-6ce6-4303-93e6-c957dc265d43" (UID: "3355df72-6ce6-4303-93e6-c957dc265d43"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:00:30 crc kubenswrapper[4817]: I0218 14:00:30.392704 4817 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3355df72-6ce6-4303-93e6-c957dc265d43-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:30 crc kubenswrapper[4817]: I0218 14:00:30.402689 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3355df72-6ce6-4303-93e6-c957dc265d43-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3355df72-6ce6-4303-93e6-c957dc265d43" (UID: "3355df72-6ce6-4303-93e6-c957dc265d43"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:00:30 crc kubenswrapper[4817]: I0218 14:00:30.494292 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3355df72-6ce6-4303-93e6-c957dc265d43-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:31 crc kubenswrapper[4817]: I0218 14:00:31.189185 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rx2s" event={"ID":"6a6721c8-f88b-4812-9cce-0e6d959d5fa6","Type":"ContainerStarted","Data":"6f8024755b91fac9bd1bc15e10de3a92c8205a5f11e81299e51d2373a75e4f2e"} Feb 18 14:00:31 crc kubenswrapper[4817]: I0218 14:00:31.191099 4817 generic.go:334] "Generic (PLEG): container finished" podID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerID="ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338" exitCode=0 Feb 18 14:00:31 crc kubenswrapper[4817]: I0218 14:00:31.191144 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86tmt" event={"ID":"99fce15c-b13c-4341-b1a0-494d5bd3f76a","Type":"ContainerDied","Data":"ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338"} Feb 18 14:00:31 crc kubenswrapper[4817]: I0218 14:00:31.193326 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 18 14:00:31 crc kubenswrapper[4817]: I0218 14:00:31.193313 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3355df72-6ce6-4303-93e6-c957dc265d43","Type":"ContainerDied","Data":"a216e32c9ee8e2f98b0c09b9ed7b50631efc5ca910ba06e2372631d183e2dd00"} Feb 18 14:00:31 crc kubenswrapper[4817]: I0218 14:00:31.193430 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a216e32c9ee8e2f98b0c09b9ed7b50631efc5ca910ba06e2372631d183e2dd00" Feb 18 14:00:31 crc kubenswrapper[4817]: I0218 14:00:31.196742 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnrfx" event={"ID":"262d049a-2c52-453b-a054-3a17b595d535","Type":"ContainerStarted","Data":"7ec2b58eecf629d97d2f22905fe6a3111b678e0ccdcf90e5126468ab53580908"} Feb 18 14:00:31 crc kubenswrapper[4817]: I0218 14:00:31.292309 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7rx2s" podStartSLOduration=3.518297794 podStartE2EDuration="47.292283873s" podCreationTimestamp="2026-02-18 13:59:44 +0000 UTC" firstStartedPulling="2026-02-18 13:59:46.507740907 +0000 UTC m=+49.083276890" lastFinishedPulling="2026-02-18 14:00:30.281726946 +0000 UTC m=+92.857262969" observedRunningTime="2026-02-18 14:00:31.289218343 +0000 UTC m=+93.864754326" watchObservedRunningTime="2026-02-18 14:00:31.292283873 +0000 UTC m=+93.867819886" Feb 18 14:00:31 crc kubenswrapper[4817]: I0218 14:00:31.309820 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tnrfx" podStartSLOduration=3.607965817 podStartE2EDuration="46.309800146s" podCreationTimestamp="2026-02-18 13:59:45 +0000 UTC" firstStartedPulling="2026-02-18 13:59:47.611848694 +0000 UTC m=+50.187384677" lastFinishedPulling="2026-02-18 
14:00:30.313683023 +0000 UTC m=+92.889219006" observedRunningTime="2026-02-18 14:00:31.308086401 +0000 UTC m=+93.883622384" watchObservedRunningTime="2026-02-18 14:00:31.309800146 +0000 UTC m=+93.885336129" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.205224 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86tmt" event={"ID":"99fce15c-b13c-4341-b1a0-494d5bd3f76a","Type":"ContainerStarted","Data":"7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51"} Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.226347 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-86tmt" podStartSLOduration=4.005505182 podStartE2EDuration="50.226323918s" podCreationTimestamp="2026-02-18 13:59:42 +0000 UTC" firstStartedPulling="2026-02-18 13:59:45.448995745 +0000 UTC m=+48.024531728" lastFinishedPulling="2026-02-18 14:00:31.669814471 +0000 UTC m=+94.245350464" observedRunningTime="2026-02-18 14:00:32.225871297 +0000 UTC m=+94.801407280" watchObservedRunningTime="2026-02-18 14:00:32.226323918 +0000 UTC m=+94.801859901" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.434897 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 14:00:32 crc kubenswrapper[4817]: E0218 14:00:32.438424 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3355df72-6ce6-4303-93e6-c957dc265d43" containerName="pruner" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.438465 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3355df72-6ce6-4303-93e6-c957dc265d43" containerName="pruner" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.438602 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="3355df72-6ce6-4303-93e6-c957dc265d43" containerName="pruner" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.439219 4817 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.442122 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.442369 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.460756 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.527504 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.528101 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-var-lock\") pod \"installer-9-crc\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.528135 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c98705e3-757f-4e25-88df-5dcb9a727afa-kube-api-access\") pod \"installer-9-crc\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.629368 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.629463 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-var-lock\") pod \"installer-9-crc\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.629491 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c98705e3-757f-4e25-88df-5dcb9a727afa-kube-api-access\") pod \"installer-9-crc\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.629516 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.629675 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-var-lock\") pod \"installer-9-crc\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.650917 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c98705e3-757f-4e25-88df-5dcb9a727afa-kube-api-access\") pod \"installer-9-crc\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " 
pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 14:00:32 crc kubenswrapper[4817]: I0218 14:00:32.755795 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 14:00:33 crc kubenswrapper[4817]: I0218 14:00:33.192367 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 18 14:00:33 crc kubenswrapper[4817]: I0218 14:00:33.211402 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c98705e3-757f-4e25-88df-5dcb9a727afa","Type":"ContainerStarted","Data":"4cd170d3dbdcaeed34661518f0b963bd6d63d9b4fea413b6e8d9cc38c804f622"}
Feb 18 14:00:33 crc kubenswrapper[4817]: I0218 14:00:33.214890 4817 generic.go:334] "Generic (PLEG): container finished" podID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerID="20c296afdc5f49ae6034c399882826415a69343541925a9cc42470bfa011b92a" exitCode=0
Feb 18 14:00:33 crc kubenswrapper[4817]: I0218 14:00:33.215030 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z68r" event={"ID":"c8e393a5-61e4-4e91-bec4-770687b8d01b","Type":"ContainerDied","Data":"20c296afdc5f49ae6034c399882826415a69343541925a9cc42470bfa011b92a"}
Feb 18 14:00:33 crc kubenswrapper[4817]: I0218 14:00:33.219235 4817 generic.go:334] "Generic (PLEG): container finished" podID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerID="1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c" exitCode=0
Feb 18 14:00:33 crc kubenswrapper[4817]: I0218 14:00:33.219304 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxln5" event={"ID":"8162b014-86d1-482a-8c7c-eba34fed3f62","Type":"ContainerDied","Data":"1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c"}
Feb 18 14:00:33 crc kubenswrapper[4817]: I0218 14:00:33.602695 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-86tmt"
Feb 18 14:00:33 crc kubenswrapper[4817]: I0218 14:00:33.603279 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-86tmt"
Feb 18 14:00:34 crc kubenswrapper[4817]: I0218 14:00:34.228031 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c98705e3-757f-4e25-88df-5dcb9a727afa","Type":"ContainerStarted","Data":"29dde688cf7db1249d35eb8cc245d1611738535cc862bdcf2c52dc0e2ff83e16"}
Feb 18 14:00:34 crc kubenswrapper[4817]: I0218 14:00:34.650319 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-86tmt" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerName="registry-server" probeResult="failure" output=<
Feb 18 14:00:34 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Feb 18 14:00:34 crc kubenswrapper[4817]: >
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.139141 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7rx2s"
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.139204 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7rx2s"
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.186198 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7rx2s"
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.236950 4817 generic.go:334] "Generic (PLEG): container finished" podID="307f9900-9137-46bb-9b32-254ae14c8c17" containerID="d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23" exitCode=0
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.237061 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54ftv" event={"ID":"307f9900-9137-46bb-9b32-254ae14c8c17","Type":"ContainerDied","Data":"d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23"}
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.241475 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z68r" event={"ID":"c8e393a5-61e4-4e91-bec4-770687b8d01b","Type":"ContainerStarted","Data":"7fb66e54528ab6ff9d1c0c8d4b09001788707003b8efc16d7c6c01848f98f457"}
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.246815 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxln5" event={"ID":"8162b014-86d1-482a-8c7c-eba34fed3f62","Type":"ContainerStarted","Data":"7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b"}
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.300647 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hxln5" podStartSLOduration=3.093440735 podStartE2EDuration="53.300624491s" podCreationTimestamp="2026-02-18 13:59:42 +0000 UTC" firstStartedPulling="2026-02-18 13:59:44.253539034 +0000 UTC m=+46.829075017" lastFinishedPulling="2026-02-18 14:00:34.46072279 +0000 UTC m=+97.036258773" observedRunningTime="2026-02-18 14:00:35.281459935 +0000 UTC m=+97.856995908" watchObservedRunningTime="2026-02-18 14:00:35.300624491 +0000 UTC m=+97.876160474"
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.306424 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.30640832 podStartE2EDuration="3.30640832s" podCreationTimestamp="2026-02-18 14:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:00:35.300247171 +0000 UTC m=+97.875783174" watchObservedRunningTime="2026-02-18 14:00:35.30640832 +0000 UTC m=+97.881944303"
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.307622 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7rx2s"
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.319163 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2z68r" podStartSLOduration=3.210490054 podStartE2EDuration="53.31913567s" podCreationTimestamp="2026-02-18 13:59:42 +0000 UTC" firstStartedPulling="2026-02-18 13:59:44.244382137 +0000 UTC m=+46.819918120" lastFinishedPulling="2026-02-18 14:00:34.353027753 +0000 UTC m=+96.928563736" observedRunningTime="2026-02-18 14:00:35.318321469 +0000 UTC m=+97.893857452" watchObservedRunningTime="2026-02-18 14:00:35.31913567 +0000 UTC m=+97.894671653"
Feb 18 14:00:35 crc kubenswrapper[4817]: I0218 14:00:35.966951 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b7925"
Feb 18 14:00:36 crc kubenswrapper[4817]: I0218 14:00:36.012409 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b7925"
Feb 18 14:00:36 crc kubenswrapper[4817]: I0218 14:00:36.268341 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tnrfx"
Feb 18 14:00:36 crc kubenswrapper[4817]: I0218 14:00:36.268401 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tnrfx"
Feb 18 14:00:36 crc kubenswrapper[4817]: I0218 14:00:36.557092 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rx2s"]
Feb 18 14:00:37 crc kubenswrapper[4817]: I0218 14:00:37.261409 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7rx2s" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerName="registry-server" containerID="cri-o://6f8024755b91fac9bd1bc15e10de3a92c8205a5f11e81299e51d2373a75e4f2e" gracePeriod=2
Feb 18 14:00:37 crc kubenswrapper[4817]: I0218 14:00:37.311743 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tnrfx" podUID="262d049a-2c52-453b-a054-3a17b595d535" containerName="registry-server" probeResult="failure" output=<
Feb 18 14:00:37 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Feb 18 14:00:37 crc kubenswrapper[4817]: >
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.269606 4817 generic.go:334] "Generic (PLEG): container finished" podID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerID="6f8024755b91fac9bd1bc15e10de3a92c8205a5f11e81299e51d2373a75e4f2e" exitCode=0
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.269701 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rx2s" event={"ID":"6a6721c8-f88b-4812-9cce-0e6d959d5fa6","Type":"ContainerDied","Data":"6f8024755b91fac9bd1bc15e10de3a92c8205a5f11e81299e51d2373a75e4f2e"}
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.275000 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54ftv" event={"ID":"307f9900-9137-46bb-9b32-254ae14c8c17","Type":"ContainerStarted","Data":"c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5"}
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.301412 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-54ftv" podStartSLOduration=2.964726325 podStartE2EDuration="56.30138651s" podCreationTimestamp="2026-02-18 13:59:42 +0000 UTC" firstStartedPulling="2026-02-18 13:59:44.213141589 +0000 UTC m=+46.788677572" lastFinishedPulling="2026-02-18 14:00:37.549801764 +0000 UTC m=+100.125337757" observedRunningTime="2026-02-18 14:00:38.298535536 +0000 UTC m=+100.874071539" watchObservedRunningTime="2026-02-18 14:00:38.30138651 +0000 UTC m=+100.876922493"
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.744705 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rx2s"
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.819537 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkmk4\" (UniqueName: \"kubernetes.io/projected/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-kube-api-access-dkmk4\") pod \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") "
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.819625 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-utilities\") pod \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") "
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.819673 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-catalog-content\") pod \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\" (UID: \"6a6721c8-f88b-4812-9cce-0e6d959d5fa6\") "
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.820564 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-utilities" (OuterVolumeSpecName: "utilities") pod "6a6721c8-f88b-4812-9cce-0e6d959d5fa6" (UID: "6a6721c8-f88b-4812-9cce-0e6d959d5fa6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.827202 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-kube-api-access-dkmk4" (OuterVolumeSpecName: "kube-api-access-dkmk4") pod "6a6721c8-f88b-4812-9cce-0e6d959d5fa6" (UID: "6a6721c8-f88b-4812-9cce-0e6d959d5fa6"). InnerVolumeSpecName "kube-api-access-dkmk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.859390 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a6721c8-f88b-4812-9cce-0e6d959d5fa6" (UID: "6a6721c8-f88b-4812-9cce-0e6d959d5fa6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.924645 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkmk4\" (UniqueName: \"kubernetes.io/projected/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-kube-api-access-dkmk4\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.924686 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:38 crc kubenswrapper[4817]: I0218 14:00:38.924697 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a6721c8-f88b-4812-9cce-0e6d959d5fa6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:39 crc kubenswrapper[4817]: I0218 14:00:39.284540 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7rx2s" event={"ID":"6a6721c8-f88b-4812-9cce-0e6d959d5fa6","Type":"ContainerDied","Data":"89123074c9267fed28141100445d38cd151b06de54c450a33a7a15d7b3afd2e5"}
Feb 18 14:00:39 crc kubenswrapper[4817]: I0218 14:00:39.284635 4817 scope.go:117] "RemoveContainer" containerID="6f8024755b91fac9bd1bc15e10de3a92c8205a5f11e81299e51d2373a75e4f2e"
Feb 18 14:00:39 crc kubenswrapper[4817]: I0218 14:00:39.284576 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7rx2s"
Feb 18 14:00:39 crc kubenswrapper[4817]: I0218 14:00:39.317196 4817 scope.go:117] "RemoveContainer" containerID="a6f16947d34f156b8bc2a1bbb19ce4c538cad6f5c149b7b7328d86a76beb7b91"
Feb 18 14:00:39 crc kubenswrapper[4817]: I0218 14:00:39.318644 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rx2s"]
Feb 18 14:00:39 crc kubenswrapper[4817]: I0218 14:00:39.323814 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7rx2s"]
Feb 18 14:00:39 crc kubenswrapper[4817]: I0218 14:00:39.354612 4817 scope.go:117] "RemoveContainer" containerID="0c1fe5e133d24e57580de1ecbfddb8804019d9dbcbcecc4e4d25a091a3cdef12"
Feb 18 14:00:40 crc kubenswrapper[4817]: I0218 14:00:40.184700 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" path="/var/lib/kubelet/pods/6a6721c8-f88b-4812-9cce-0e6d959d5fa6/volumes"
Feb 18 14:00:40 crc kubenswrapper[4817]: I0218 14:00:40.390548 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"]
Feb 18 14:00:40 crc kubenswrapper[4817]: I0218 14:00:40.390836 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr" podUID="baf9f8d0-6947-408f-b2c9-404c3ee736ef" containerName="controller-manager" containerID="cri-o://40b7df6d72c27392b5e43146e1e4e12a88088599937810b2285e75a27969cdfd" gracePeriod=30
Feb 18 14:00:40 crc kubenswrapper[4817]: I0218 14:00:40.413694 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"]
Feb 18 14:00:40 crc kubenswrapper[4817]: I0218 14:00:40.413946 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp" podUID="a7d34513-4dab-417b-8a46-485f4d69b0f0" containerName="route-controller-manager" containerID="cri-o://81747b7bce374f699fde9e66791d8f8e3fb4e115052f57b1f6cee52363d1e7b9" gracePeriod=30
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.304904 4817 generic.go:334] "Generic (PLEG): container finished" podID="baf9f8d0-6947-408f-b2c9-404c3ee736ef" containerID="40b7df6d72c27392b5e43146e1e4e12a88088599937810b2285e75a27969cdfd" exitCode=0
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.305064 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr" event={"ID":"baf9f8d0-6947-408f-b2c9-404c3ee736ef","Type":"ContainerDied","Data":"40b7df6d72c27392b5e43146e1e4e12a88088599937810b2285e75a27969cdfd"}
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.307291 4817 generic.go:334] "Generic (PLEG): container finished" podID="a7d34513-4dab-417b-8a46-485f4d69b0f0" containerID="81747b7bce374f699fde9e66791d8f8e3fb4e115052f57b1f6cee52363d1e7b9" exitCode=0
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.307362 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp" event={"ID":"a7d34513-4dab-417b-8a46-485f4d69b0f0","Type":"ContainerDied","Data":"81747b7bce374f699fde9e66791d8f8e3fb4e115052f57b1f6cee52363d1e7b9"}
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.887182 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.918268 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"]
Feb 18 14:00:41 crc kubenswrapper[4817]: E0218 14:00:41.918686 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerName="extract-content"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.918705 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerName="extract-content"
Feb 18 14:00:41 crc kubenswrapper[4817]: E0218 14:00:41.918745 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d34513-4dab-417b-8a46-485f4d69b0f0" containerName="route-controller-manager"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.918756 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d34513-4dab-417b-8a46-485f4d69b0f0" containerName="route-controller-manager"
Feb 18 14:00:41 crc kubenswrapper[4817]: E0218 14:00:41.918772 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerName="registry-server"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.918780 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerName="registry-server"
Feb 18 14:00:41 crc kubenswrapper[4817]: E0218 14:00:41.918819 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerName="extract-utilities"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.918828 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerName="extract-utilities"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.919057 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d34513-4dab-417b-8a46-485f4d69b0f0" containerName="route-controller-manager"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.919084 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6721c8-f88b-4812-9cce-0e6d959d5fa6" containerName="registry-server"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.919857 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.940006 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"]
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.963955 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d34513-4dab-417b-8a46-485f4d69b0f0-serving-cert\") pod \"a7d34513-4dab-417b-8a46-485f4d69b0f0\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") "
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.964045 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-client-ca\") pod \"a7d34513-4dab-417b-8a46-485f4d69b0f0\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") "
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.964087 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlwtd\" (UniqueName: \"kubernetes.io/projected/a7d34513-4dab-417b-8a46-485f4d69b0f0-kube-api-access-zlwtd\") pod \"a7d34513-4dab-417b-8a46-485f4d69b0f0\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") "
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.964164 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-config\") pod \"a7d34513-4dab-417b-8a46-485f4d69b0f0\" (UID: \"a7d34513-4dab-417b-8a46-485f4d69b0f0\") "
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.964333 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-config\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.964413 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4mm9\" (UniqueName: \"kubernetes.io/projected/51819d15-196a-46db-95d9-fd71a6c42921-kube-api-access-w4mm9\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.964470 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51819d15-196a-46db-95d9-fd71a6c42921-serving-cert\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.964498 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-client-ca\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.965247 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "a7d34513-4dab-417b-8a46-485f4d69b0f0" (UID: "a7d34513-4dab-417b-8a46-485f4d69b0f0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.965440 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-config" (OuterVolumeSpecName: "config") pod "a7d34513-4dab-417b-8a46-485f4d69b0f0" (UID: "a7d34513-4dab-417b-8a46-485f4d69b0f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.971352 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d34513-4dab-417b-8a46-485f4d69b0f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a7d34513-4dab-417b-8a46-485f4d69b0f0" (UID: "a7d34513-4dab-417b-8a46-485f4d69b0f0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:00:41 crc kubenswrapper[4817]: I0218 14:00:41.972792 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d34513-4dab-417b-8a46-485f4d69b0f0-kube-api-access-zlwtd" (OuterVolumeSpecName: "kube-api-access-zlwtd") pod "a7d34513-4dab-417b-8a46-485f4d69b0f0" (UID: "a7d34513-4dab-417b-8a46-485f4d69b0f0"). InnerVolumeSpecName "kube-api-access-zlwtd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.049883 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.065914 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4mm9\" (UniqueName: \"kubernetes.io/projected/51819d15-196a-46db-95d9-fd71a6c42921-kube-api-access-w4mm9\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.066021 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51819d15-196a-46db-95d9-fd71a6c42921-serving-cert\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.066050 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-client-ca\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.066092 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-config\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.066137 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d34513-4dab-417b-8a46-485f4d69b0f0-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.066174 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.066183 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlwtd\" (UniqueName: \"kubernetes.io/projected/a7d34513-4dab-417b-8a46-485f4d69b0f0-kube-api-access-zlwtd\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.066194 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d34513-4dab-417b-8a46-485f4d69b0f0-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.067409 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-client-ca\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.067991 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-config\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.073758 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51819d15-196a-46db-95d9-fd71a6c42921-serving-cert\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.089317 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4mm9\" (UniqueName: \"kubernetes.io/projected/51819d15-196a-46db-95d9-fd71a6c42921-kube-api-access-w4mm9\") pod \"route-controller-manager-75d476cb65-v9cbc\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.166917 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-config\") pod \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") "
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.166952 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-client-ca\") pod \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") "
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.166968 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baf9f8d0-6947-408f-b2c9-404c3ee736ef-serving-cert\") pod \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") "
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.167046 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhdml\" (UniqueName: \"kubernetes.io/projected/baf9f8d0-6947-408f-b2c9-404c3ee736ef-kube-api-access-fhdml\") pod \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") "
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.167132 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-proxy-ca-bundles\") pod \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\" (UID: \"baf9f8d0-6947-408f-b2c9-404c3ee736ef\") "
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.168269 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-client-ca" (OuterVolumeSpecName: "client-ca") pod "baf9f8d0-6947-408f-b2c9-404c3ee736ef" (UID: "baf9f8d0-6947-408f-b2c9-404c3ee736ef"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.168303 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-config" (OuterVolumeSpecName: "config") pod "baf9f8d0-6947-408f-b2c9-404c3ee736ef" (UID: "baf9f8d0-6947-408f-b2c9-404c3ee736ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.168833 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "baf9f8d0-6947-408f-b2c9-404c3ee736ef" (UID: "baf9f8d0-6947-408f-b2c9-404c3ee736ef"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.170563 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf9f8d0-6947-408f-b2c9-404c3ee736ef-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "baf9f8d0-6947-408f-b2c9-404c3ee736ef" (UID: "baf9f8d0-6947-408f-b2c9-404c3ee736ef"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.170952 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf9f8d0-6947-408f-b2c9-404c3ee736ef-kube-api-access-fhdml" (OuterVolumeSpecName: "kube-api-access-fhdml") pod "baf9f8d0-6947-408f-b2c9-404c3ee736ef" (UID: "baf9f8d0-6947-408f-b2c9-404c3ee736ef"). InnerVolumeSpecName "kube-api-access-fhdml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.244093 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.271919 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.272111 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baf9f8d0-6947-408f-b2c9-404c3ee736ef-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.272122 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.272133 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhdml\" (UniqueName: \"kubernetes.io/projected/baf9f8d0-6947-408f-b2c9-404c3ee736ef-kube-api-access-fhdml\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.272159 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/baf9f8d0-6947-408f-b2c9-404c3ee736ef-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.313905 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp" event={"ID":"a7d34513-4dab-417b-8a46-485f4d69b0f0","Type":"ContainerDied","Data":"bc1039387b7498ab4518b596624e4a28b03653d2edde0ac65f82a20cc51b4715"}
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.313963 4817 scope.go:117] "RemoveContainer" containerID="81747b7bce374f699fde9e66791d8f8e3fb4e115052f57b1f6cee52363d1e7b9"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.314093 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.319833 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr" event={"ID":"baf9f8d0-6947-408f-b2c9-404c3ee736ef","Type":"ContainerDied","Data":"b5c66ec1889d9fcc32d135292dcbaaa3d8bc66756b48840ea1632a04f8c3832d"}
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.319924 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.334473 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"]
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.343119 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67679b45f6-zc9dp"]
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.358041 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"]
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.359684 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d6b7b4f9b-rj4jr"]
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.364044 4817 scope.go:117] "RemoveContainer" containerID="40b7df6d72c27392b5e43146e1e4e12a88088599937810b2285e75a27969cdfd"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.464929 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"]
Feb 18 14:00:42 crc kubenswrapper[4817]: W0218 14:00:42.471194 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51819d15_196a_46db_95d9_fd71a6c42921.slice/crio-f28d27fe581798ba069678bc445176f4b584615c91f3395751c3baad9fd1d4ea WatchSource:0}: Error finding container f28d27fe581798ba069678bc445176f4b584615c91f3395751c3baad9fd1d4ea: Status 404 returned error can't find the container with id f28d27fe581798ba069678bc445176f4b584615c91f3395751c3baad9fd1d4ea
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.693855 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-54ftv"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.694373 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-54ftv"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.751780 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-54ftv"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.863211 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hxln5"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.863293 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hxln5"
Feb 18 14:00:42 crc kubenswrapper[4817]: I0218 14:00:42.913791 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hxln5"
Feb 18 14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.053585 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2z68r"
Feb 18 14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.053654 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2z68r"
Feb 18
14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.095061 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2z68r" Feb 18 14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.332633 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc" event={"ID":"51819d15-196a-46db-95d9-fd71a6c42921","Type":"ContainerStarted","Data":"943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e"} Feb 18 14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.332683 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc" event={"ID":"51819d15-196a-46db-95d9-fd71a6c42921","Type":"ContainerStarted","Data":"f28d27fe581798ba069678bc445176f4b584615c91f3395751c3baad9fd1d4ea"} Feb 18 14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.352228 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc" podStartSLOduration=3.352195419 podStartE2EDuration="3.352195419s" podCreationTimestamp="2026-02-18 14:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:00:43.351787209 +0000 UTC m=+105.927323222" watchObservedRunningTime="2026-02-18 14:00:43.352195419 +0000 UTC m=+105.927731432" Feb 18 14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.382424 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hxln5" Feb 18 14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.396277 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2z68r" Feb 18 14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.396417 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-54ftv" Feb 18 14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.642524 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-86tmt" Feb 18 14:00:43 crc kubenswrapper[4817]: I0218 14:00:43.701362 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-86tmt" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.104638 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bclz6"] Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.179284 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d34513-4dab-417b-8a46-485f4d69b0f0" path="/var/lib/kubelet/pods/a7d34513-4dab-417b-8a46-485f4d69b0f0/volumes" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.179937 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baf9f8d0-6947-408f-b2c9-404c3ee736ef" path="/var/lib/kubelet/pods/baf9f8d0-6947-408f-b2c9-404c3ee736ef/volumes" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.338830 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.344052 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.450498 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq"] Feb 18 14:00:44 crc kubenswrapper[4817]: E0218 14:00:44.450822 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf9f8d0-6947-408f-b2c9-404c3ee736ef" containerName="controller-manager" Feb 18 14:00:44 crc 
kubenswrapper[4817]: I0218 14:00:44.450835 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf9f8d0-6947-408f-b2c9-404c3ee736ef" containerName="controller-manager" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.450948 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf9f8d0-6947-408f-b2c9-404c3ee736ef" containerName="controller-manager" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.451759 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.457932 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.458178 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.458322 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.458371 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.458899 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.459039 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.462456 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.463854 4817 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq"] Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.504732 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f619e35-6845-4798-8498-93a544ebf76f-serving-cert\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.504926 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fvgg\" (UniqueName: \"kubernetes.io/projected/2f619e35-6845-4798-8498-93a544ebf76f-kube-api-access-8fvgg\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.505151 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-proxy-ca-bundles\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.505250 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-client-ca\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.505335 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-config\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.607023 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-config\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.607112 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f619e35-6845-4798-8498-93a544ebf76f-serving-cert\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.607209 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fvgg\" (UniqueName: \"kubernetes.io/projected/2f619e35-6845-4798-8498-93a544ebf76f-kube-api-access-8fvgg\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.607267 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-proxy-ca-bundles\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc 
kubenswrapper[4817]: I0218 14:00:44.607308 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-client-ca\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.609577 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-proxy-ca-bundles\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.609959 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-client-ca\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.611150 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-config\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.618831 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f619e35-6845-4798-8498-93a544ebf76f-serving-cert\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " 
pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.635558 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fvgg\" (UniqueName: \"kubernetes.io/projected/2f619e35-6845-4798-8498-93a544ebf76f-kube-api-access-8fvgg\") pod \"controller-manager-7cdfb6bb75-csdcq\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:44 crc kubenswrapper[4817]: I0218 14:00:44.778057 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:45 crc kubenswrapper[4817]: I0218 14:00:45.204557 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq"] Feb 18 14:00:45 crc kubenswrapper[4817]: W0218 14:00:45.216382 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f619e35_6845_4798_8498_93a544ebf76f.slice/crio-75c49a9c6b83512f923731bba7f88996725874ca94dd37dcf4c79e39c660014c WatchSource:0}: Error finding container 75c49a9c6b83512f923731bba7f88996725874ca94dd37dcf4c79e39c660014c: Status 404 returned error can't find the container with id 75c49a9c6b83512f923731bba7f88996725874ca94dd37dcf4c79e39c660014c Feb 18 14:00:45 crc kubenswrapper[4817]: I0218 14:00:45.346451 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" event={"ID":"2f619e35-6845-4798-8498-93a544ebf76f","Type":"ContainerStarted","Data":"75c49a9c6b83512f923731bba7f88996725874ca94dd37dcf4c79e39c660014c"} Feb 18 14:00:45 crc kubenswrapper[4817]: I0218 14:00:45.554154 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-86tmt"] Feb 18 14:00:45 crc kubenswrapper[4817]: I0218 
14:00:45.554809 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-86tmt" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerName="registry-server" containerID="cri-o://7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51" gracePeriod=2 Feb 18 14:00:45 crc kubenswrapper[4817]: I0218 14:00:45.999303 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86tmt" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.128795 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvd9n\" (UniqueName: \"kubernetes.io/projected/99fce15c-b13c-4341-b1a0-494d5bd3f76a-kube-api-access-wvd9n\") pod \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.128858 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-catalog-content\") pod \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.128887 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-utilities\") pod \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\" (UID: \"99fce15c-b13c-4341-b1a0-494d5bd3f76a\") " Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.130237 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-utilities" (OuterVolumeSpecName: "utilities") pod "99fce15c-b13c-4341-b1a0-494d5bd3f76a" (UID: "99fce15c-b13c-4341-b1a0-494d5bd3f76a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.136380 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99fce15c-b13c-4341-b1a0-494d5bd3f76a-kube-api-access-wvd9n" (OuterVolumeSpecName: "kube-api-access-wvd9n") pod "99fce15c-b13c-4341-b1a0-494d5bd3f76a" (UID: "99fce15c-b13c-4341-b1a0-494d5bd3f76a"). InnerVolumeSpecName "kube-api-access-wvd9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.186858 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99fce15c-b13c-4341-b1a0-494d5bd3f76a" (UID: "99fce15c-b13c-4341-b1a0-494d5bd3f76a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.234925 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.234968 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvd9n\" (UniqueName: \"kubernetes.io/projected/99fce15c-b13c-4341-b1a0-494d5bd3f76a-kube-api-access-wvd9n\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.234995 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99fce15c-b13c-4341-b1a0-494d5bd3f76a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.312709 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 14:00:46 crc 
kubenswrapper[4817]: I0218 14:00:46.357742 4817 generic.go:334] "Generic (PLEG): container finished" podID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerID="7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51" exitCode=0 Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.357834 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86tmt" event={"ID":"99fce15c-b13c-4341-b1a0-494d5bd3f76a","Type":"ContainerDied","Data":"7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51"} Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.357872 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86tmt" event={"ID":"99fce15c-b13c-4341-b1a0-494d5bd3f76a","Type":"ContainerDied","Data":"8c99bfa6e9fdc892640d7765a99b85a4c0be2f0c50d238a35cc4f5c958f1db72"} Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.357893 4817 scope.go:117] "RemoveContainer" containerID="7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.357836 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-86tmt" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.363663 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" event={"ID":"2f619e35-6845-4798-8498-93a544ebf76f","Type":"ContainerStarted","Data":"9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827"} Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.365111 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.381889 4817 scope.go:117] "RemoveContainer" containerID="ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.413069 4817 scope.go:117] "RemoveContainer" containerID="000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.414372 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" podStartSLOduration=6.414343507 podStartE2EDuration="6.414343507s" podCreationTimestamp="2026-02-18 14:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:00:46.388471408 +0000 UTC m=+108.964007601" watchObservedRunningTime="2026-02-18 14:00:46.414343507 +0000 UTC m=+108.989879490" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.426947 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-86tmt"] Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.430888 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-86tmt"] Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.443237 4817 scope.go:117] 
"RemoveContainer" containerID="7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51" Feb 18 14:00:46 crc kubenswrapper[4817]: E0218 14:00:46.445642 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51\": container with ID starting with 7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51 not found: ID does not exist" containerID="7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.445761 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51"} err="failed to get container status \"7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51\": rpc error: code = NotFound desc = could not find container \"7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51\": container with ID starting with 7bc4127ce676e9891883629547522ec5e19fbe72da2493c2841734f470eb6e51 not found: ID does not exist" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.445812 4817 scope.go:117] "RemoveContainer" containerID="ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338" Feb 18 14:00:46 crc kubenswrapper[4817]: E0218 14:00:46.446350 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338\": container with ID starting with ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338 not found: ID does not exist" containerID="ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.446421 4817 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338"} err="failed to get container status \"ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338\": rpc error: code = NotFound desc = could not find container \"ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338\": container with ID starting with ba58bd67f7538c6969c429f5ac015b747a611196bf153a0a065e5999ea137338 not found: ID does not exist" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.446709 4817 scope.go:117] "RemoveContainer" containerID="000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132" Feb 18 14:00:46 crc kubenswrapper[4817]: E0218 14:00:46.447317 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132\": container with ID starting with 000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132 not found: ID does not exist" containerID="000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.447391 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132"} err="failed to get container status \"000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132\": rpc error: code = NotFound desc = could not find container \"000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132\": container with ID starting with 000e4ecf16c79304599cad55da15072d5d55c86d1dfb50e157fdbb969ff7b132 not found: ID does not exist" Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.957863 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2z68r"] Feb 18 14:00:46 crc kubenswrapper[4817]: I0218 14:00:46.958349 4817 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/community-operators-2z68r" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerName="registry-server" containerID="cri-o://7fb66e54528ab6ff9d1c0c8d4b09001788707003b8efc16d7c6c01848f98f457" gracePeriod=2 Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.371407 4817 generic.go:334] "Generic (PLEG): container finished" podID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerID="7fb66e54528ab6ff9d1c0c8d4b09001788707003b8efc16d7c6c01848f98f457" exitCode=0 Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.371634 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z68r" event={"ID":"c8e393a5-61e4-4e91-bec4-770687b8d01b","Type":"ContainerDied","Data":"7fb66e54528ab6ff9d1c0c8d4b09001788707003b8efc16d7c6c01848f98f457"} Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.371812 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z68r" event={"ID":"c8e393a5-61e4-4e91-bec4-770687b8d01b","Type":"ContainerDied","Data":"f812b0571f00f76f0c5dd6615443393efddcb51f670924edf17c87aeb6413b65"} Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.371831 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f812b0571f00f76f0c5dd6615443393efddcb51f670924edf17c87aeb6413b65" Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.373971 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.375101 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2z68r" Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.379996 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.464518 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxcv9\" (UniqueName: \"kubernetes.io/projected/c8e393a5-61e4-4e91-bec4-770687b8d01b-kube-api-access-rxcv9\") pod \"c8e393a5-61e4-4e91-bec4-770687b8d01b\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.464602 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-utilities\") pod \"c8e393a5-61e4-4e91-bec4-770687b8d01b\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.464666 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-catalog-content\") pod \"c8e393a5-61e4-4e91-bec4-770687b8d01b\" (UID: \"c8e393a5-61e4-4e91-bec4-770687b8d01b\") " Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.466141 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-utilities" (OuterVolumeSpecName: "utilities") pod "c8e393a5-61e4-4e91-bec4-770687b8d01b" (UID: "c8e393a5-61e4-4e91-bec4-770687b8d01b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.471183 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e393a5-61e4-4e91-bec4-770687b8d01b-kube-api-access-rxcv9" (OuterVolumeSpecName: "kube-api-access-rxcv9") pod "c8e393a5-61e4-4e91-bec4-770687b8d01b" (UID: "c8e393a5-61e4-4e91-bec4-770687b8d01b"). InnerVolumeSpecName "kube-api-access-rxcv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.534240 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8e393a5-61e4-4e91-bec4-770687b8d01b" (UID: "c8e393a5-61e4-4e91-bec4-770687b8d01b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.566512 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxcv9\" (UniqueName: \"kubernetes.io/projected/c8e393a5-61e4-4e91-bec4-770687b8d01b-kube-api-access-rxcv9\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.566598 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:47 crc kubenswrapper[4817]: I0218 14:00:47.566613 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8e393a5-61e4-4e91-bec4-770687b8d01b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:48 crc kubenswrapper[4817]: I0218 14:00:48.184236 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" 
path="/var/lib/kubelet/pods/99fce15c-b13c-4341-b1a0-494d5bd3f76a/volumes" Feb 18 14:00:48 crc kubenswrapper[4817]: I0218 14:00:48.380298 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2z68r" Feb 18 14:00:48 crc kubenswrapper[4817]: I0218 14:00:48.401417 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2z68r"] Feb 18 14:00:48 crc kubenswrapper[4817]: I0218 14:00:48.405749 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2z68r"] Feb 18 14:00:49 crc kubenswrapper[4817]: I0218 14:00:49.953750 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnrfx"] Feb 18 14:00:49 crc kubenswrapper[4817]: I0218 14:00:49.954672 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tnrfx" podUID="262d049a-2c52-453b-a054-3a17b595d535" containerName="registry-server" containerID="cri-o://7ec2b58eecf629d97d2f22905fe6a3111b678e0ccdcf90e5126468ab53580908" gracePeriod=2 Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.179170 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" path="/var/lib/kubelet/pods/c8e393a5-61e4-4e91-bec4-770687b8d01b/volumes" Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.405677 4817 generic.go:334] "Generic (PLEG): container finished" podID="262d049a-2c52-453b-a054-3a17b595d535" containerID="7ec2b58eecf629d97d2f22905fe6a3111b678e0ccdcf90e5126468ab53580908" exitCode=0 Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.405753 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnrfx" event={"ID":"262d049a-2c52-453b-a054-3a17b595d535","Type":"ContainerDied","Data":"7ec2b58eecf629d97d2f22905fe6a3111b678e0ccdcf90e5126468ab53580908"} Feb 18 14:00:50 
crc kubenswrapper[4817]: I0218 14:00:50.405797 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnrfx" event={"ID":"262d049a-2c52-453b-a054-3a17b595d535","Type":"ContainerDied","Data":"a3db0c2b12a86a48bcdd47504e810d8fa320217e8c4786b0882dc49689502b25"} Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.405819 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3db0c2b12a86a48bcdd47504e810d8fa320217e8c4786b0882dc49689502b25" Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.430855 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.519955 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-utilities\") pod \"262d049a-2c52-453b-a054-3a17b595d535\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.520062 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-catalog-content\") pod \"262d049a-2c52-453b-a054-3a17b595d535\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.520116 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rlmz\" (UniqueName: \"kubernetes.io/projected/262d049a-2c52-453b-a054-3a17b595d535-kube-api-access-6rlmz\") pod \"262d049a-2c52-453b-a054-3a17b595d535\" (UID: \"262d049a-2c52-453b-a054-3a17b595d535\") " Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.521073 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-utilities" 
(OuterVolumeSpecName: "utilities") pod "262d049a-2c52-453b-a054-3a17b595d535" (UID: "262d049a-2c52-453b-a054-3a17b595d535"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.527255 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/262d049a-2c52-453b-a054-3a17b595d535-kube-api-access-6rlmz" (OuterVolumeSpecName: "kube-api-access-6rlmz") pod "262d049a-2c52-453b-a054-3a17b595d535" (UID: "262d049a-2c52-453b-a054-3a17b595d535"). InnerVolumeSpecName "kube-api-access-6rlmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.621602 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rlmz\" (UniqueName: \"kubernetes.io/projected/262d049a-2c52-453b-a054-3a17b595d535-kube-api-access-6rlmz\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.621653 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.656039 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "262d049a-2c52-453b-a054-3a17b595d535" (UID: "262d049a-2c52-453b-a054-3a17b595d535"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:00:50 crc kubenswrapper[4817]: I0218 14:00:50.723128 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/262d049a-2c52-453b-a054-3a17b595d535-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:00:51 crc kubenswrapper[4817]: I0218 14:00:51.412338 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnrfx" Feb 18 14:00:51 crc kubenswrapper[4817]: I0218 14:00:51.451542 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnrfx"] Feb 18 14:00:51 crc kubenswrapper[4817]: I0218 14:00:51.455727 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tnrfx"] Feb 18 14:00:52 crc kubenswrapper[4817]: I0218 14:00:52.187689 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="262d049a-2c52-453b-a054-3a17b595d535" path="/var/lib/kubelet/pods/262d049a-2c52-453b-a054-3a17b595d535/volumes" Feb 18 14:01:00 crc kubenswrapper[4817]: I0218 14:01:00.393080 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq"] Feb 18 14:01:00 crc kubenswrapper[4817]: I0218 14:01:00.395212 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" podUID="2f619e35-6845-4798-8498-93a544ebf76f" containerName="controller-manager" containerID="cri-o://9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827" gracePeriod=30 Feb 18 14:01:00 crc kubenswrapper[4817]: I0218 14:01:00.497962 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"] Feb 18 14:01:00 crc kubenswrapper[4817]: I0218 14:01:00.498223 4817 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc" podUID="51819d15-196a-46db-95d9-fd71a6c42921" containerName="route-controller-manager" containerID="cri-o://943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e" gracePeriod=30 Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.051215 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.158547 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-client-ca\") pod \"51819d15-196a-46db-95d9-fd71a6c42921\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.158621 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51819d15-196a-46db-95d9-fd71a6c42921-serving-cert\") pod \"51819d15-196a-46db-95d9-fd71a6c42921\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.158713 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-config\") pod \"51819d15-196a-46db-95d9-fd71a6c42921\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.158755 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4mm9\" (UniqueName: \"kubernetes.io/projected/51819d15-196a-46db-95d9-fd71a6c42921-kube-api-access-w4mm9\") pod \"51819d15-196a-46db-95d9-fd71a6c42921\" (UID: \"51819d15-196a-46db-95d9-fd71a6c42921\") " Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.160409 
4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-config" (OuterVolumeSpecName: "config") pod "51819d15-196a-46db-95d9-fd71a6c42921" (UID: "51819d15-196a-46db-95d9-fd71a6c42921"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.160465 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-client-ca" (OuterVolumeSpecName: "client-ca") pod "51819d15-196a-46db-95d9-fd71a6c42921" (UID: "51819d15-196a-46db-95d9-fd71a6c42921"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.167382 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51819d15-196a-46db-95d9-fd71a6c42921-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "51819d15-196a-46db-95d9-fd71a6c42921" (UID: "51819d15-196a-46db-95d9-fd71a6c42921"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.167379 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51819d15-196a-46db-95d9-fd71a6c42921-kube-api-access-w4mm9" (OuterVolumeSpecName: "kube-api-access-w4mm9") pod "51819d15-196a-46db-95d9-fd71a6c42921" (UID: "51819d15-196a-46db-95d9-fd71a6c42921"). InnerVolumeSpecName "kube-api-access-w4mm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.240842 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.261631 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51819d15-196a-46db-95d9-fd71a6c42921-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.261674 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.261684 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4mm9\" (UniqueName: \"kubernetes.io/projected/51819d15-196a-46db-95d9-fd71a6c42921-kube-api-access-w4mm9\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.261693 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51819d15-196a-46db-95d9-fd71a6c42921-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.362661 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fvgg\" (UniqueName: \"kubernetes.io/projected/2f619e35-6845-4798-8498-93a544ebf76f-kube-api-access-8fvgg\") pod \"2f619e35-6845-4798-8498-93a544ebf76f\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.362759 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-client-ca\") pod \"2f619e35-6845-4798-8498-93a544ebf76f\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.362801 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-proxy-ca-bundles\") pod \"2f619e35-6845-4798-8498-93a544ebf76f\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.362850 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-config\") pod \"2f619e35-6845-4798-8498-93a544ebf76f\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.362877 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f619e35-6845-4798-8498-93a544ebf76f-serving-cert\") pod \"2f619e35-6845-4798-8498-93a544ebf76f\" (UID: \"2f619e35-6845-4798-8498-93a544ebf76f\") " Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.363694 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-client-ca" (OuterVolumeSpecName: "client-ca") pod "2f619e35-6845-4798-8498-93a544ebf76f" (UID: "2f619e35-6845-4798-8498-93a544ebf76f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.363777 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-config" (OuterVolumeSpecName: "config") pod "2f619e35-6845-4798-8498-93a544ebf76f" (UID: "2f619e35-6845-4798-8498-93a544ebf76f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.363871 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2f619e35-6845-4798-8498-93a544ebf76f" (UID: "2f619e35-6845-4798-8498-93a544ebf76f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.365917 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f619e35-6845-4798-8498-93a544ebf76f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2f619e35-6845-4798-8498-93a544ebf76f" (UID: "2f619e35-6845-4798-8498-93a544ebf76f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.366277 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f619e35-6845-4798-8498-93a544ebf76f-kube-api-access-8fvgg" (OuterVolumeSpecName: "kube-api-access-8fvgg") pod "2f619e35-6845-4798-8498-93a544ebf76f" (UID: "2f619e35-6845-4798-8498-93a544ebf76f"). InnerVolumeSpecName "kube-api-access-8fvgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.465402 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.465460 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f619e35-6845-4798-8498-93a544ebf76f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.465487 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fvgg\" (UniqueName: \"kubernetes.io/projected/2f619e35-6845-4798-8498-93a544ebf76f-kube-api-access-8fvgg\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.465509 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.465527 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f619e35-6845-4798-8498-93a544ebf76f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.466533 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78c74bbf86-277pl"] Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.466771 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262d049a-2c52-453b-a054-3a17b595d535" containerName="extract-content" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.466786 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="262d049a-2c52-453b-a054-3a17b595d535" containerName="extract-content" Feb 18 14:01:01 crc 
kubenswrapper[4817]: E0218 14:01:01.466803 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f619e35-6845-4798-8498-93a544ebf76f" containerName="controller-manager" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.466812 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f619e35-6845-4798-8498-93a544ebf76f" containerName="controller-manager" Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.466828 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262d049a-2c52-453b-a054-3a17b595d535" containerName="registry-server" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.466838 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="262d049a-2c52-453b-a054-3a17b595d535" containerName="registry-server" Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.466853 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerName="extract-content" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.466862 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerName="extract-content" Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.466887 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerName="registry-server" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.466895 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerName="registry-server" Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.466909 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51819d15-196a-46db-95d9-fd71a6c42921" containerName="route-controller-manager" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.466917 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="51819d15-196a-46db-95d9-fd71a6c42921" containerName="route-controller-manager" Feb 18 
14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.466931 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerName="extract-utilities" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.466940 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerName="extract-utilities" Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.466955 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerName="extract-utilities" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.466966 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerName="extract-utilities" Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.466999 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262d049a-2c52-453b-a054-3a17b595d535" containerName="extract-utilities" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467011 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="262d049a-2c52-453b-a054-3a17b595d535" containerName="extract-utilities" Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.467024 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerName="extract-content" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467037 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerName="extract-content" Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.467050 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerName="registry-server" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467059 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerName="registry-server" Feb 18 
14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467184 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="262d049a-2c52-453b-a054-3a17b595d535" containerName="registry-server" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467206 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e393a5-61e4-4e91-bec4-770687b8d01b" containerName="registry-server" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467218 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="51819d15-196a-46db-95d9-fd71a6c42921" containerName="route-controller-manager" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467229 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f619e35-6845-4798-8498-93a544ebf76f" containerName="controller-manager" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467243 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="99fce15c-b13c-4341-b1a0-494d5bd3f76a" containerName="registry-server" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467443 4817 generic.go:334] "Generic (PLEG): container finished" podID="51819d15-196a-46db-95d9-fd71a6c42921" containerID="943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e" exitCode=0 Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467594 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc" event={"ID":"51819d15-196a-46db-95d9-fd71a6c42921","Type":"ContainerDied","Data":"943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e"} Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467629 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc" event={"ID":"51819d15-196a-46db-95d9-fd71a6c42921","Type":"ContainerDied","Data":"f28d27fe581798ba069678bc445176f4b584615c91f3395751c3baad9fd1d4ea"} Feb 18 14:01:01 crc 
kubenswrapper[4817]: I0218 14:01:01.467645 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.467651 4817 scope.go:117] "RemoveContainer" containerID="943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.468380 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.475285 4817 generic.go:334] "Generic (PLEG): container finished" podID="2f619e35-6845-4798-8498-93a544ebf76f" containerID="9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827" exitCode=0 Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.475333 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" event={"ID":"2f619e35-6845-4798-8498-93a544ebf76f","Type":"ContainerDied","Data":"9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827"} Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.475370 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" event={"ID":"2f619e35-6845-4798-8498-93a544ebf76f","Type":"ContainerDied","Data":"75c49a9c6b83512f923731bba7f88996725874ca94dd37dcf4c79e39c660014c"} Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.475433 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.480361 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c74bbf86-277pl"] Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.506594 4817 scope.go:117] "RemoveContainer" containerID="943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e" Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.506957 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e\": container with ID starting with 943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e not found: ID does not exist" containerID="943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.512069 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e"} err="failed to get container status \"943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e\": rpc error: code = NotFound desc = could not find container \"943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e\": container with ID starting with 943381e1111f66492875ce8aa6d32c207ac46e89f0bcc4062866e8ae745c065e not found: ID does not exist" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.512101 4817 scope.go:117] "RemoveContainer" containerID="9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.515438 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq"] Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.520358 4817 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-controller-manager/controller-manager-7cdfb6bb75-csdcq"] Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.536462 4817 scope.go:117] "RemoveContainer" containerID="9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827" Feb 18 14:01:01 crc kubenswrapper[4817]: E0218 14:01:01.537019 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827\": container with ID starting with 9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827 not found: ID does not exist" containerID="9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.537067 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827"} err="failed to get container status \"9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827\": rpc error: code = NotFound desc = could not find container \"9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827\": container with ID starting with 9ac1e2da0ff962a3195b6a5c228b395faf7fdab23558c6c11f3be0a185213827 not found: ID does not exist" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.542887 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"] Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.546662 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75d476cb65-v9cbc"] Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.566927 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-client-ca\") pod 
\"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.566971 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-config\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.567015 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f24f96b-bc64-4d8e-be89-4b64b35ca424-serving-cert\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.567316 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzjfc\" (UniqueName: \"kubernetes.io/projected/0f24f96b-bc64-4d8e-be89-4b64b35ca424-kube-api-access-lzjfc\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.567433 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-proxy-ca-bundles\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.668493 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-client-ca\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.668547 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-config\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.668568 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f24f96b-bc64-4d8e-be89-4b64b35ca424-serving-cert\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.668610 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzjfc\" (UniqueName: \"kubernetes.io/projected/0f24f96b-bc64-4d8e-be89-4b64b35ca424-kube-api-access-lzjfc\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.668635 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-proxy-ca-bundles\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " 
pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.669859 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-proxy-ca-bundles\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.670349 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-client-ca\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.670441 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-config\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.676624 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f24f96b-bc64-4d8e-be89-4b64b35ca424-serving-cert\") pod \"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.684946 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzjfc\" (UniqueName: \"kubernetes.io/projected/0f24f96b-bc64-4d8e-be89-4b64b35ca424-kube-api-access-lzjfc\") pod 
\"controller-manager-78c74bbf86-277pl\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") " pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:01 crc kubenswrapper[4817]: I0218 14:01:01.797657 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.180901 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f619e35-6845-4798-8498-93a544ebf76f" path="/var/lib/kubelet/pods/2f619e35-6845-4798-8498-93a544ebf76f/volumes" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.182015 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51819d15-196a-46db-95d9-fd71a6c42921" path="/var/lib/kubelet/pods/51819d15-196a-46db-95d9-fd71a6c42921/volumes" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.236492 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c74bbf86-277pl"] Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.469465 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf"] Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.470420 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.476371 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.477098 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.477297 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.477899 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.478340 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.478368 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.483474 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" event={"ID":"0f24f96b-bc64-4d8e-be89-4b64b35ca424","Type":"ContainerStarted","Data":"2c8a52905476463555a33d7685f2400f6515981f007fb379d32b98c00a426d81"} Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.515814 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf"] Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.579253 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-client-ca\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.579507 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqrq2\" (UniqueName: \"kubernetes.io/projected/03358e20-1724-438e-b1eb-3d9e8ea550e4-kube-api-access-gqrq2\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.579723 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-config\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.579801 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03358e20-1724-438e-b1eb-3d9e8ea550e4-serving-cert\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.680876 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-client-ca\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.680963 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqrq2\" (UniqueName: \"kubernetes.io/projected/03358e20-1724-438e-b1eb-3d9e8ea550e4-kube-api-access-gqrq2\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.681017 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-config\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.681040 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03358e20-1724-438e-b1eb-3d9e8ea550e4-serving-cert\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.682831 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-client-ca\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.683126 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-config\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.687130 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03358e20-1724-438e-b1eb-3d9e8ea550e4-serving-cert\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.707910 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqrq2\" (UniqueName: \"kubernetes.io/projected/03358e20-1724-438e-b1eb-3d9e8ea550e4-kube-api-access-gqrq2\") pod \"route-controller-manager-7f8cd4bb4b-8wbjf\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") " pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:02 crc kubenswrapper[4817]: I0218 14:01:02.831104 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:03 crc kubenswrapper[4817]: I0218 14:01:03.068604 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf"] Feb 18 14:01:03 crc kubenswrapper[4817]: W0218 14:01:03.077604 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03358e20_1724_438e_b1eb_3d9e8ea550e4.slice/crio-dfee772858f26ab02ae54e51815f6c4955cbe4fed51cdf8f3ed4a22d3d28ee20 WatchSource:0}: Error finding container dfee772858f26ab02ae54e51815f6c4955cbe4fed51cdf8f3ed4a22d3d28ee20: Status 404 returned error can't find the container with id dfee772858f26ab02ae54e51815f6c4955cbe4fed51cdf8f3ed4a22d3d28ee20 Feb 18 14:01:03 crc kubenswrapper[4817]: I0218 14:01:03.496125 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" event={"ID":"03358e20-1724-438e-b1eb-3d9e8ea550e4","Type":"ContainerStarted","Data":"ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8"} Feb 18 14:01:03 crc kubenswrapper[4817]: I0218 14:01:03.496179 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" event={"ID":"03358e20-1724-438e-b1eb-3d9e8ea550e4","Type":"ContainerStarted","Data":"dfee772858f26ab02ae54e51815f6c4955cbe4fed51cdf8f3ed4a22d3d28ee20"} Feb 18 14:01:03 crc kubenswrapper[4817]: I0218 14:01:03.497433 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:03 crc kubenswrapper[4817]: I0218 14:01:03.501170 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" 
event={"ID":"0f24f96b-bc64-4d8e-be89-4b64b35ca424","Type":"ContainerStarted","Data":"f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754"} Feb 18 14:01:03 crc kubenswrapper[4817]: I0218 14:01:03.501421 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:03 crc kubenswrapper[4817]: I0218 14:01:03.507874 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" Feb 18 14:01:03 crc kubenswrapper[4817]: I0218 14:01:03.532851 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" podStartSLOduration=3.532821535 podStartE2EDuration="3.532821535s" podCreationTimestamp="2026-02-18 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:01:03.529425987 +0000 UTC m=+126.104961970" watchObservedRunningTime="2026-02-18 14:01:03.532821535 +0000 UTC m=+126.108357558" Feb 18 14:01:03 crc kubenswrapper[4817]: I0218 14:01:03.556937 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" podStartSLOduration=3.556915069 podStartE2EDuration="3.556915069s" podCreationTimestamp="2026-02-18 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:01:03.548635524 +0000 UTC m=+126.124171517" watchObservedRunningTime="2026-02-18 14:01:03.556915069 +0000 UTC m=+126.132451062" Feb 18 14:01:03 crc kubenswrapper[4817]: I0218 14:01:03.565349 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" Feb 18 14:01:09 
crc kubenswrapper[4817]: I0218 14:01:09.164069 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" podUID="39a56faf-6fea-45d0-9531-fb86f571fd8b" containerName="oauth-openshift" containerID="cri-o://91fb855daa3b08d69c92e04c18b40eaae7d8f9a536834a012f82f23b235f7ec7" gracePeriod=15 Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.551444 4817 generic.go:334] "Generic (PLEG): container finished" podID="39a56faf-6fea-45d0-9531-fb86f571fd8b" containerID="91fb855daa3b08d69c92e04c18b40eaae7d8f9a536834a012f82f23b235f7ec7" exitCode=0 Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.551488 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" event={"ID":"39a56faf-6fea-45d0-9531-fb86f571fd8b","Type":"ContainerDied","Data":"91fb855daa3b08d69c92e04c18b40eaae7d8f9a536834a012f82f23b235f7ec7"} Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.751124 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.901719 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.901949 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902036 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902113 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902159 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-dir\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 
14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902213 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902302 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902370 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902423 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902460 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902516 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gtns2\" (UniqueName: \"kubernetes.io/projected/39a56faf-6fea-45d0-9531-fb86f571fd8b-kube-api-access-gtns2\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902558 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902608 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.902646 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session\") pod \"39a56faf-6fea-45d0-9531-fb86f571fd8b\" (UID: \"39a56faf-6fea-45d0-9531-fb86f571fd8b\") " Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.903159 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.904358 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.904885 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.905161 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.905754 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.911122 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.911473 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.912246 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.912469 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.912467 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a56faf-6fea-45d0-9531-fb86f571fd8b-kube-api-access-gtns2" (OuterVolumeSpecName: "kube-api-access-gtns2") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "kube-api-access-gtns2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.913108 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.913722 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.914048 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:01:09 crc kubenswrapper[4817]: I0218 14:01:09.914633 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "39a56faf-6fea-45d0-9531-fb86f571fd8b" (UID: "39a56faf-6fea-45d0-9531-fb86f571fd8b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004234 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004304 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtns2\" (UniqueName: \"kubernetes.io/projected/39a56faf-6fea-45d0-9531-fb86f571fd8b-kube-api-access-gtns2\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004328 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004352 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004370 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004388 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004406 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004423 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004442 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004466 4817 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004485 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004502 4817 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004519 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.004538 4817 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/39a56faf-6fea-45d0-9531-fb86f571fd8b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.564668 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6"
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.564669 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bclz6" event={"ID":"39a56faf-6fea-45d0-9531-fb86f571fd8b","Type":"ContainerDied","Data":"617ee40a821789b35f46fd4fda50ddf8b86146ed0ba0705cbe8b35afdf0f6ccd"}
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.564934 4817 scope.go:117] "RemoveContainer" containerID="91fb855daa3b08d69c92e04c18b40eaae7d8f9a536834a012f82f23b235f7ec7"
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.597954 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bclz6"]
Feb 18 14:01:10 crc kubenswrapper[4817]: I0218 14:01:10.606122 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bclz6"]
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.471528 4817 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"]
Feb 18 14:01:11 crc kubenswrapper[4817]: E0218 14:01:11.471747 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a56faf-6fea-45d0-9531-fb86f571fd8b" containerName="oauth-openshift"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.471760 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a56faf-6fea-45d0-9531-fb86f571fd8b" containerName="oauth-openshift"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.471853 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a56faf-6fea-45d0-9531-fb86f571fd8b" containerName="oauth-openshift"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.472257 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.474923 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.476029 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.476118 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.476177 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.476669 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.477197 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.477367 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.477529 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.477713 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.477852 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.477916 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.480306 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.487190 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.491139 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.496033 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"]
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.497753 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218
14:01:11.628695 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.628770 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-service-ca\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.628820 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4cbcb73b-c91d-47a8-ae83-38439e150615-audit-dir\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.628854 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-template-login\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.628883 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-template-error\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.628910 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.628931 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-router-certs\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.629406 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.629491 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tctr2\" (UniqueName: \"kubernetes.io/projected/4cbcb73b-c91d-47a8-ae83-38439e150615-kube-api-access-tctr2\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID:
\"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.629548 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-audit-policies\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.629628 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.629666 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.629696 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-session\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.629742 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.730792 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tctr2\" (UniqueName: \"kubernetes.io/projected/4cbcb73b-c91d-47a8-ae83-38439e150615-kube-api-access-tctr2\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.730893 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-audit-policies\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.730943 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731032 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName:
\"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731104 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-session\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731174 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731256 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731321 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-service-ca\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731400 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4cbcb73b-c91d-47a8-ae83-38439e150615-audit-dir\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731464 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-template-login\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731523 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-template-error\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731581 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731609 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName:
\"kubernetes.io/host-path/4cbcb73b-c91d-47a8-ae83-38439e150615-audit-dir\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731629 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-router-certs\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731742 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731864 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-audit-policies\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.731901 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.732292 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-service-ca\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.733308 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.736637 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.737963 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.740233 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-template-error\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.740868 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.742473 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.747574 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-session\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.749907 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-system-router-certs\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.752083 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tctr2\" (UniqueName: \"kubernetes.io/projected/4cbcb73b-c91d-47a8-ae83-38439e150615-kube-api-access-tctr2\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.753138 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4cbcb73b-c91d-47a8-ae83-38439e150615-v4-0-config-user-template-login\") pod \"oauth-openshift-54cb467b7d-jgbv8\" (UID: \"4cbcb73b-c91d-47a8-ae83-38439e150615\") " pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.766176 4817 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.767196 4817 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.767391 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.767643 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011" gracePeriod=15
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.767738 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f" gracePeriod=15
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.767776 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a" gracePeriod=15
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.767695 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48" gracePeriod=15
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.768083 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701" gracePeriod=15
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.771454 4817 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 18 14:01:11 crc kubenswrapper[4817]: E0218 14:01:11.771800 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.771841 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 18 14:01:11 crc kubenswrapper[4817]: E0218 14:01:11.771862 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.771880 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 18 14:01:11 crc kubenswrapper[4817]: E0218 14:01:11.771914 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.771930 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 18 14:01:11 crc kubenswrapper[4817]: E0218 14:01:11.771951 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.771967 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 18 14:01:11 crc kubenswrapper[4817]: E0218 14:01:11.772072 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.772090 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 18 14:01:11 crc kubenswrapper[4817]: E0218 14:01:11.772115 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.772132 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 18 14:01:11 crc kubenswrapper[4817]: E0218 14:01:11.772163 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.772180 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.772395 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.772428 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.772446 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.772469 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.772496 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.772907 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.807751 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:11 crc kubenswrapper[4817]: E0218 14:01:11.847996 4817 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.933566 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.933622 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.933644 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.933678 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.933696 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.933730 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.933754 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 14:01:11 crc kubenswrapper[4817]: I0218 14:01:11.933779 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.035691 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.035772 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.035813 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.035854 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.035865 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.035906 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.035920 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.035888 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.035950 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.035962 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.036014 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.036036 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.036041 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.036078 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.036116 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.036124 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.148916 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: W0218 14:01:12.169910 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-fbbe5e12908ecc7c79c1b060167b3daa92385e69aaaacd50a46387ce1c5ae217 WatchSource:0}: Error finding container fbbe5e12908ecc7c79c1b060167b3daa92385e69aaaacd50a46387ce1c5ae217: Status 404 returned error can't find the container with id fbbe5e12908ecc7c79c1b060167b3daa92385e69aaaacd50a46387ce1c5ae217 Feb 18 14:01:12 crc kubenswrapper[4817]: E0218 14:01:12.173677 4817 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18955c14a4b992be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 14:01:12.17288467 +0000 UTC m=+134.748420653,LastTimestamp:2026-02-18 14:01:12.17288467 +0000 UTC m=+134.748420653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.180317 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a56faf-6fea-45d0-9531-fb86f571fd8b" path="/var/lib/kubelet/pods/39a56faf-6fea-45d0-9531-fb86f571fd8b/volumes" Feb 18 14:01:12 crc kubenswrapper[4817]: E0218 14:01:12.269138 4817 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" volumeName="registry-storage" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.586247 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.589455 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.590606 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48" exitCode=0 Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.590642 4817 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f" exitCode=0 Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.590651 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701" exitCode=0 Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.590660 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a" exitCode=2 Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.590794 4817 scope.go:117] "RemoveContainer" containerID="eea321d8e7b36278725596935e697ed5a16ba2c976f519db63e438907d80dcce" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.593488 4817 generic.go:334] "Generic (PLEG): container finished" podID="c98705e3-757f-4e25-88df-5dcb9a727afa" containerID="29dde688cf7db1249d35eb8cc245d1611738535cc862bdcf2c52dc0e2ff83e16" exitCode=0 Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.593561 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c98705e3-757f-4e25-88df-5dcb9a727afa","Type":"ContainerDied","Data":"29dde688cf7db1249d35eb8cc245d1611738535cc862bdcf2c52dc0e2ff83e16"} Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.595672 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24"} Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.595703 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fbbe5e12908ecc7c79c1b060167b3daa92385e69aaaacd50a46387ce1c5ae217"} Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.596066 4817 status_manager.go:851] "Failed to get status for pod" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Feb 18 14:01:12 crc kubenswrapper[4817]: E0218 14:01:12.596439 4817 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 18 14:01:12 crc kubenswrapper[4817]: I0218 14:01:12.597556 4817 status_manager.go:851] "Failed to get status for pod" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Feb 18 14:01:12 crc kubenswrapper[4817]: E0218 14:01:12.598834 4817 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 18 14:01:12 crc kubenswrapper[4817]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-54cb467b7d-jgbv8_openshift-authentication_4cbcb73b-c91d-47a8-ae83-38439e150615_0(7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0): error adding pod openshift-authentication_oauth-openshift-54cb467b7d-jgbv8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0" 
Netns:"/var/run/netns/0279ee59-8907-4e05-a59f-0dbb2b6e8a67" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54cb467b7d-jgbv8;K8S_POD_INFRA_CONTAINER_ID=7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0;K8S_POD_UID=4cbcb73b-c91d-47a8-ae83-38439e150615" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8] networking: Multus: [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8/4cbcb73b-c91d-47a8-ae83-38439e150615]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-54cb467b7d-jgbv8?timeout=1m0s": dial tcp 38.102.83.38:6443: connect: connection refused Feb 18 14:01:12 crc kubenswrapper[4817]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 18 14:01:12 crc kubenswrapper[4817]: > Feb 18 14:01:12 crc kubenswrapper[4817]: E0218 14:01:12.598897 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 18 14:01:12 crc kubenswrapper[4817]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-54cb467b7d-jgbv8_openshift-authentication_4cbcb73b-c91d-47a8-ae83-38439e150615_0(7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0): error adding pod openshift-authentication_oauth-openshift-54cb467b7d-jgbv8 to CNI network 
"multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0" Netns:"/var/run/netns/0279ee59-8907-4e05-a59f-0dbb2b6e8a67" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54cb467b7d-jgbv8;K8S_POD_INFRA_CONTAINER_ID=7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0;K8S_POD_UID=4cbcb73b-c91d-47a8-ae83-38439e150615" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8] networking: Multus: [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8/4cbcb73b-c91d-47a8-ae83-38439e150615]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-54cb467b7d-jgbv8?timeout=1m0s": dial tcp 38.102.83.38:6443: connect: connection refused Feb 18 14:01:12 crc kubenswrapper[4817]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 18 14:01:12 crc kubenswrapper[4817]: > pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:01:12 crc kubenswrapper[4817]: E0218 14:01:12.598922 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 18 14:01:12 crc kubenswrapper[4817]: rpc error: code = Unknown desc = failed to create pod 
network sandbox k8s_oauth-openshift-54cb467b7d-jgbv8_openshift-authentication_4cbcb73b-c91d-47a8-ae83-38439e150615_0(7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0): error adding pod openshift-authentication_oauth-openshift-54cb467b7d-jgbv8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0" Netns:"/var/run/netns/0279ee59-8907-4e05-a59f-0dbb2b6e8a67" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54cb467b7d-jgbv8;K8S_POD_INFRA_CONTAINER_ID=7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0;K8S_POD_UID=4cbcb73b-c91d-47a8-ae83-38439e150615" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8] networking: Multus: [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8/4cbcb73b-c91d-47a8-ae83-38439e150615]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-54cb467b7d-jgbv8?timeout=1m0s": dial tcp 38.102.83.38:6443: connect: connection refused Feb 18 14:01:12 crc kubenswrapper[4817]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 18 14:01:12 crc kubenswrapper[4817]: > 
pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:01:12 crc kubenswrapper[4817]: E0218 14:01:12.599001 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-54cb467b7d-jgbv8_openshift-authentication(4cbcb73b-c91d-47a8-ae83-38439e150615)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-54cb467b7d-jgbv8_openshift-authentication(4cbcb73b-c91d-47a8-ae83-38439e150615)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-54cb467b7d-jgbv8_openshift-authentication_4cbcb73b-c91d-47a8-ae83-38439e150615_0(7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0): error adding pod openshift-authentication_oauth-openshift-54cb467b7d-jgbv8 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0\\\" Netns:\\\"/var/run/netns/0279ee59-8907-4e05-a59f-0dbb2b6e8a67\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54cb467b7d-jgbv8;K8S_POD_INFRA_CONTAINER_ID=7f739db537396c3287f1bd92b5825e6106bca7a680fb173de2e73f539bdbd8c0;K8S_POD_UID=4cbcb73b-c91d-47a8-ae83-38439e150615\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8] networking: Multus: [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8/4cbcb73b-c91d-47a8-ae83-38439e150615]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: status update failed for pod /: Get 
\\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-54cb467b7d-jgbv8?timeout=1m0s\\\": dial tcp 38.102.83.38:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" podUID="4cbcb73b-c91d-47a8-ae83-38439e150615" Feb 18 14:01:13 crc kubenswrapper[4817]: I0218 14:01:13.607114 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 14:01:13 crc kubenswrapper[4817]: I0218 14:01:13.608823 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:01:13 crc kubenswrapper[4817]: I0218 14:01:13.609696 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.128047 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.130498 4817 status_manager.go:851] "Failed to get status for pod" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.130832 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.132132 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.132708 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.133140 4817 status_manager.go:851] "Failed to get status for pod" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.276326 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-kubelet-dir\") pod \"c98705e3-757f-4e25-88df-5dcb9a727afa\" (UID: 
\"c98705e3-757f-4e25-88df-5dcb9a727afa\") " Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.276444 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.276559 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c98705e3-757f-4e25-88df-5dcb9a727afa" (UID: "c98705e3-757f-4e25-88df-5dcb9a727afa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.276653 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c98705e3-757f-4e25-88df-5dcb9a727afa-kube-api-access\") pod \"c98705e3-757f-4e25-88df-5dcb9a727afa\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.276691 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.276708 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-var-lock\") pod \"c98705e3-757f-4e25-88df-5dcb9a727afa\" (UID: \"c98705e3-757f-4e25-88df-5dcb9a727afa\") " Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.276723 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.276715 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.276802 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.276828 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-var-lock" (OuterVolumeSpecName: "var-lock") pod "c98705e3-757f-4e25-88df-5dcb9a727afa" (UID: "c98705e3-757f-4e25-88df-5dcb9a727afa"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.277055 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.277094 4817 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.277206 4817 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.277237 4817 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.277261 4817 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c98705e3-757f-4e25-88df-5dcb9a727afa-var-lock\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.298550 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98705e3-757f-4e25-88df-5dcb9a727afa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c98705e3-757f-4e25-88df-5dcb9a727afa" (UID: "c98705e3-757f-4e25-88df-5dcb9a727afa"). InnerVolumeSpecName "kube-api-access".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.378966 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c98705e3-757f-4e25-88df-5dcb9a727afa-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.379032 4817 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 18 14:01:14 crc kubenswrapper[4817]: E0218 14:01:14.578165 4817 log.go:32] "RunPodSandbox from runtime service failed" err=<
Feb 18 14:01:14 crc kubenswrapper[4817]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-54cb467b7d-jgbv8_openshift-authentication_4cbcb73b-c91d-47a8-ae83-38439e150615_0(8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d): error adding pod openshift-authentication_oauth-openshift-54cb467b7d-jgbv8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d" Netns:"/var/run/netns/7a4570ac-7cf9-4be0-8a1b-a9bb697a2ef6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54cb467b7d-jgbv8;K8S_POD_INFRA_CONTAINER_ID=8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d;K8S_POD_UID=4cbcb73b-c91d-47a8-ae83-38439e150615" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8] networking: Multus: [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8/4cbcb73b-c91d-47a8-ae83-38439e150615]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm:
SetNetworkStatus: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-54cb467b7d-jgbv8?timeout=1m0s": dial tcp 38.102.83.38:6443: connect: connection refused
Feb 18 14:01:14 crc kubenswrapper[4817]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 18 14:01:14 crc kubenswrapper[4817]: >
Feb 18 14:01:14 crc kubenswrapper[4817]: E0218 14:01:14.578280 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Feb 18 14:01:14 crc kubenswrapper[4817]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-54cb467b7d-jgbv8_openshift-authentication_4cbcb73b-c91d-47a8-ae83-38439e150615_0(8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d): error adding pod openshift-authentication_oauth-openshift-54cb467b7d-jgbv8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d" Netns:"/var/run/netns/7a4570ac-7cf9-4be0-8a1b-a9bb697a2ef6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54cb467b7d-jgbv8;K8S_POD_INFRA_CONTAINER_ID=8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d;K8S_POD_UID=4cbcb73b-c91d-47a8-ae83-38439e150615" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8] networking: Multus:
[openshift-authentication/oauth-openshift-54cb467b7d-jgbv8/4cbcb73b-c91d-47a8-ae83-38439e150615]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-54cb467b7d-jgbv8?timeout=1m0s": dial tcp 38.102.83.38:6443: connect: connection refused
Feb 18 14:01:14 crc kubenswrapper[4817]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 18 14:01:14 crc kubenswrapper[4817]: > pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:14 crc kubenswrapper[4817]: E0218 14:01:14.578313 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Feb 18 14:01:14 crc kubenswrapper[4817]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-54cb467b7d-jgbv8_openshift-authentication_4cbcb73b-c91d-47a8-ae83-38439e150615_0(8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d): error adding pod openshift-authentication_oauth-openshift-54cb467b7d-jgbv8 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d" Netns:"/var/run/netns/7a4570ac-7cf9-4be0-8a1b-a9bb697a2ef6" IfName:"eth0"
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54cb467b7d-jgbv8;K8S_POD_INFRA_CONTAINER_ID=8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d;K8S_POD_UID=4cbcb73b-c91d-47a8-ae83-38439e150615" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8] networking: Multus: [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8/4cbcb73b-c91d-47a8-ae83-38439e150615]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-54cb467b7d-jgbv8?timeout=1m0s": dial tcp 38.102.83.38:6443: connect: connection refused
Feb 18 14:01:14 crc kubenswrapper[4817]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 18 14:01:14 crc kubenswrapper[4817]: > pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"
Feb 18 14:01:14 crc kubenswrapper[4817]: E0218 14:01:14.578405 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-54cb467b7d-jgbv8_openshift-authentication(4cbcb73b-c91d-47a8-ae83-38439e150615)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-54cb467b7d-jgbv8_openshift-authentication(4cbcb73b-c91d-47a8-ae83-38439e150615)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox
k8s_oauth-openshift-54cb467b7d-jgbv8_openshift-authentication_4cbcb73b-c91d-47a8-ae83-38439e150615_0(8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d): error adding pod openshift-authentication_oauth-openshift-54cb467b7d-jgbv8 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d\\\" Netns:\\\"/var/run/netns/7a4570ac-7cf9-4be0-8a1b-a9bb697a2ef6\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54cb467b7d-jgbv8;K8S_POD_INFRA_CONTAINER_ID=8bb0cdedf0d9dd52dc5919ddd81cbd2afe42262dcca6a597f3157821f47c0d8d;K8S_POD_UID=4cbcb73b-c91d-47a8-ae83-38439e150615\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8] networking: Multus: [openshift-authentication/oauth-openshift-54cb467b7d-jgbv8/4cbcb73b-c91d-47a8-ae83-38439e150615]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: SetNetworkStatus: failed to update the pod oauth-openshift-54cb467b7d-jgbv8 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-54cb467b7d-jgbv8?timeout=1m0s\\\": dial tcp 38.102.83.38:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" podUID="4cbcb73b-c91d-47a8-ae83-38439e150615"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.622325 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.623544 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011" exitCode=0
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.623742 4817 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.623656 4817 scope.go:117] "RemoveContainer" containerID="d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.624914 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.625672 4817 status_manager.go:851] "Failed to get status for pod" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.626491 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c98705e3-757f-4e25-88df-5dcb9a727afa","Type":"ContainerDied","Data":"4cd170d3dbdcaeed34661518f0b963bd6d63d9b4fea413b6e8d9cc38c804f622"}
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.626553 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd170d3dbdcaeed34661518f0b963bd6d63d9b4fea413b6e8d9cc38c804f622"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.626598 4817 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.643241 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.643760 4817 status_manager.go:851] "Failed to get status for pod" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.647141 4817 status_manager.go:851] "Failed to get status for pod" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.648258 4817 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.653247 4817 scope.go:117] "RemoveContainer" containerID="68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.681292 4817 scope.go:117] "RemoveContainer" containerID="c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701"
Feb 18 14:01:14 crc
kubenswrapper[4817]: I0218 14:01:14.705958 4817 scope.go:117] "RemoveContainer" containerID="f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.732900 4817 scope.go:117] "RemoveContainer" containerID="dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.767695 4817 scope.go:117] "RemoveContainer" containerID="7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.797401 4817 scope.go:117] "RemoveContainer" containerID="d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48"
Feb 18 14:01:14 crc kubenswrapper[4817]: E0218 14:01:14.797933 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48\": container with ID starting with d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48 not found: ID does not exist" containerID="d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.797992 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48"} err="failed to get container status \"d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48\": rpc error: code = NotFound desc = could not find container \"d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48\": container with ID starting with d7ec21fa1560cd79fad347fcf677bd1f46b09ea43f3dfe3e072d7b8c233eae48 not found: ID does not exist"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.798020 4817 scope.go:117] "RemoveContainer" containerID="68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f"
Feb 18 14:01:14 crc kubenswrapper[4817]: E0218 14:01:14.798563
4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f\": container with ID starting with 68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f not found: ID does not exist" containerID="68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.798625 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f"} err="failed to get container status \"68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f\": rpc error: code = NotFound desc = could not find container \"68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f\": container with ID starting with 68a4078a92ba550fe821084cb5337702f3b17125dbe215bbe057112a0d057c7f not found: ID does not exist"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.798673 4817 scope.go:117] "RemoveContainer" containerID="c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701"
Feb 18 14:01:14 crc kubenswrapper[4817]: E0218 14:01:14.800843 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701\": container with ID starting with c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701 not found: ID does not exist" containerID="c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.800887 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701"} err="failed to get container status \"c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701\": rpc error: code =
NotFound desc = could not find container \"c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701\": container with ID starting with c0109fd97f0cd99d13ff1d4b00b268ccecdbb5ab78143c6c7299526c8de0b701 not found: ID does not exist"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.800917 4817 scope.go:117] "RemoveContainer" containerID="f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a"
Feb 18 14:01:14 crc kubenswrapper[4817]: E0218 14:01:14.801561 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a\": container with ID starting with f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a not found: ID does not exist" containerID="f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.801590 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a"} err="failed to get container status \"f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a\": rpc error: code = NotFound desc = could not find container \"f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a\": container with ID starting with f192c80c5b84a0c8f855352f85dadcf7ca8cff63d3ebcd76ca3ecdb4459f467a not found: ID does not exist"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.801609 4817 scope.go:117] "RemoveContainer" containerID="dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011"
Feb 18 14:01:14 crc kubenswrapper[4817]: E0218 14:01:14.801954 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011\": container with ID starting with
dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011 not found: ID does not exist" containerID="dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.802010 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011"} err="failed to get container status \"dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011\": rpc error: code = NotFound desc = could not find container \"dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011\": container with ID starting with dcaf32faa113768c4a407bd44c1293e8b47dea2412a4cceda99805f5d13c7011 not found: ID does not exist"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.802036 4817 scope.go:117] "RemoveContainer" containerID="7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1"
Feb 18 14:01:14 crc kubenswrapper[4817]: E0218 14:01:14.802311 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\": container with ID starting with 7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1 not found: ID does not exist" containerID="7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1"
Feb 18 14:01:14 crc kubenswrapper[4817]: I0218 14:01:14.802340 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1"} err="failed to get container status \"7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\": rpc error: code = NotFound desc = could not find container \"7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1\": container with ID starting with 7a253c8589983c04399a0e26dd9ef673d6ba96e4063974031901099b6518ead1 not found: ID does not
exist"
Feb 18 14:01:16 crc kubenswrapper[4817]: I0218 14:01:16.185820 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 18 14:01:16 crc kubenswrapper[4817]: E0218 14:01:16.738544 4817 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:16 crc kubenswrapper[4817]: E0218 14:01:16.739138 4817 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:16 crc kubenswrapper[4817]: E0218 14:01:16.739685 4817 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:16 crc kubenswrapper[4817]: E0218 14:01:16.740234 4817 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:16 crc kubenswrapper[4817]: E0218 14:01:16.740722 4817 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:16 crc kubenswrapper[4817]: I0218 14:01:16.740772 4817 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 18 14:01:16 crc kubenswrapper[4817]: E0218 14:01:16.741235 4817
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="200ms"
Feb 18 14:01:16 crc kubenswrapper[4817]: E0218 14:01:16.941909 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="400ms"
Feb 18 14:01:16 crc kubenswrapper[4817]: E0218 14:01:16.946508 4817 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18955c14a4b992be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-18 14:01:12.17288467 +0000 UTC m=+134.748420653,LastTimestamp:2026-02-18 14:01:12.17288467 +0000 UTC m=+134.748420653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 18 14:01:17 crc kubenswrapper[4817]: E0218 14:01:17.141338 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T14:01:17Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T14:01:17Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T14:01:17Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-18T14:01:17Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:17 crc kubenswrapper[4817]: E0218 14:01:17.142154 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:17 crc kubenswrapper[4817]: E0218 14:01:17.142851 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:17 crc kubenswrapper[4817]: E0218 14:01:17.143392 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18
14:01:17 crc kubenswrapper[4817]: E0218 14:01:17.143916 4817 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:17 crc kubenswrapper[4817]: E0218 14:01:17.143949 4817 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 18 14:01:17 crc kubenswrapper[4817]: E0218 14:01:17.342594 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="800ms"
Feb 18 14:01:18 crc kubenswrapper[4817]: E0218 14:01:18.145124 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="1.6s"
Feb 18 14:01:18 crc kubenswrapper[4817]: I0218 14:01:18.177885 4817 status_manager.go:851] "Failed to get status for pod" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:19 crc kubenswrapper[4817]: E0218 14:01:19.747079 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="3.2s"
Feb 18 14:01:22 crc kubenswrapper[4817]: E0218 14:01:22.948029 4817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="6.4s"
Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.171149 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.173398 4817 status_manager.go:851] "Failed to get status for pod" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused"
Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.197133 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3"
Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.197186 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3"
Feb 18 14:01:25 crc kubenswrapper[4817]: E0218 14:01:25.197911 4817 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.198657 4817 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:25 crc kubenswrapper[4817]: W0218 14:01:25.248181 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-e2eca912469226e7928553577630d1c90558e4b6e4f9bad225ad473b750bf4da WatchSource:0}: Error finding container e2eca912469226e7928553577630d1c90558e4b6e4f9bad225ad473b750bf4da: Status 404 returned error can't find the container with id e2eca912469226e7928553577630d1c90558e4b6e4f9bad225ad473b750bf4da Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.715063 4817 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="6adb181bab101b09d9fad252229d1bbc1d3b08f0a3ca27ae3df239c4f1ffd50a" exitCode=0 Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.715162 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"6adb181bab101b09d9fad252229d1bbc1d3b08f0a3ca27ae3df239c4f1ffd50a"} Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.715288 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e2eca912469226e7928553577630d1c90558e4b6e4f9bad225ad473b750bf4da"} Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.715739 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3" Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.715773 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3" Feb 18 14:01:25 crc kubenswrapper[4817]: E0218 14:01:25.717262 4817 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:25 crc kubenswrapper[4817]: I0218 14:01:25.717259 4817 status_manager.go:851] "Failed to get status for pod" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Feb 18 14:01:26 crc kubenswrapper[4817]: I0218 14:01:26.727509 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"84df58ff30846eb86fc9400ba285e44a78e06b8312f1b485185312c545f1752a"} Feb 18 14:01:26 crc kubenswrapper[4817]: I0218 14:01:26.727851 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"20db89d1eb662ba157daf966ba4bc7a3a74683efb7e518d01309a36540fe06ba"} Feb 18 14:01:26 crc kubenswrapper[4817]: I0218 14:01:26.727865 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"85a8fc123a44f54f4f436729bf68b39512db54b021e79b7ad14f275e15d1a1d5"} Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.170842 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.171569 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.552722 4817 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.553319 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.738830 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.738928 4817 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f536b0d62076bf44c14eb23cf13fe5bc79c67153d27a1dca9df0577aa9c9f036" exitCode=1 Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.739054 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f536b0d62076bf44c14eb23cf13fe5bc79c67153d27a1dca9df0577aa9c9f036"} Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.739785 4817 scope.go:117] "RemoveContainer" containerID="f536b0d62076bf44c14eb23cf13fe5bc79c67153d27a1dca9df0577aa9c9f036" Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.744962 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dc3ead0a09fbfb334d5c3c03f3b90d8b41d49d4def44807e1253021230126607"} Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.745079 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"79dd71e05e36d6167ebf35106037f47e420d2e1acaf33d3376b5e20f4165b9cc"} Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.745206 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.745234 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3" Feb 18 14:01:27 crc kubenswrapper[4817]: I0218 14:01:27.745288 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3" Feb 18 14:01:28 crc kubenswrapper[4817]: I0218 14:01:28.761519 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 18 14:01:28 crc kubenswrapper[4817]: I0218 14:01:28.761857 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"401381ada127de4742469ea0002cdf1d143d792ad0b0a157c70d5a65ce9eaa13"} Feb 18 14:01:30 crc kubenswrapper[4817]: I0218 14:01:30.198778 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:30 crc kubenswrapper[4817]: I0218 14:01:30.199473 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:30 crc kubenswrapper[4817]: I0218 14:01:30.207818 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:32 crc kubenswrapper[4817]: I0218 14:01:32.789111 4817 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:32 crc kubenswrapper[4817]: I0218 14:01:32.956856 4817 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b6206b93-00a9-4b30-b1fa-69ff8b96a4e4" Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.325791 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.329885 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.806309 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54cb467b7d-jgbv8_4cbcb73b-c91d-47a8-ae83-38439e150615/oauth-openshift/0.log" Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.806379 4817 generic.go:334] "Generic (PLEG): container finished" podID="4cbcb73b-c91d-47a8-ae83-38439e150615" containerID="4e64733c7a3e0101285b53d320500a4e9cbe019415d64f376a71626fda6cd68f" exitCode=255 Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.806475 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" event={"ID":"4cbcb73b-c91d-47a8-ae83-38439e150615","Type":"ContainerDied","Data":"4e64733c7a3e0101285b53d320500a4e9cbe019415d64f376a71626fda6cd68f"} Feb 18 14:01:33 crc 
kubenswrapper[4817]: I0218 14:01:33.806575 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" event={"ID":"4cbcb73b-c91d-47a8-ae83-38439e150615","Type":"ContainerStarted","Data":"a9960474983cc5f0e849a725de74c594e8632829b49d6307bd19eefd99d7a061"} Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.806638 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.806768 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3" Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.806784 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3" Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.807708 4817 scope.go:117] "RemoveContainer" containerID="4e64733c7a3e0101285b53d320500a4e9cbe019415d64f376a71626fda6cd68f" Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.813780 4817 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b6206b93-00a9-4b30-b1fa-69ff8b96a4e4" Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.824550 4817 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://85a8fc123a44f54f4f436729bf68b39512db54b021e79b7ad14f275e15d1a1d5" Feb 18 14:01:33 crc kubenswrapper[4817]: I0218 14:01:33.824601 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:34 crc kubenswrapper[4817]: I0218 14:01:34.817017 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication_oauth-openshift-54cb467b7d-jgbv8_4cbcb73b-c91d-47a8-ae83-38439e150615/oauth-openshift/1.log" Feb 18 14:01:34 crc kubenswrapper[4817]: I0218 14:01:34.819443 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54cb467b7d-jgbv8_4cbcb73b-c91d-47a8-ae83-38439e150615/oauth-openshift/0.log" Feb 18 14:01:34 crc kubenswrapper[4817]: I0218 14:01:34.819533 4817 generic.go:334] "Generic (PLEG): container finished" podID="4cbcb73b-c91d-47a8-ae83-38439e150615" containerID="f5261660c53167837135f95214692189b40d55ace949f47f3f64b422aca08d29" exitCode=255 Feb 18 14:01:34 crc kubenswrapper[4817]: I0218 14:01:34.819715 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" event={"ID":"4cbcb73b-c91d-47a8-ae83-38439e150615","Type":"ContainerDied","Data":"f5261660c53167837135f95214692189b40d55ace949f47f3f64b422aca08d29"} Feb 18 14:01:34 crc kubenswrapper[4817]: I0218 14:01:34.819834 4817 scope.go:117] "RemoveContainer" containerID="4e64733c7a3e0101285b53d320500a4e9cbe019415d64f376a71626fda6cd68f" Feb 18 14:01:34 crc kubenswrapper[4817]: I0218 14:01:34.820744 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3" Feb 18 14:01:34 crc kubenswrapper[4817]: I0218 14:01:34.820772 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3" Feb 18 14:01:34 crc kubenswrapper[4817]: I0218 14:01:34.821023 4817 scope.go:117] "RemoveContainer" containerID="f5261660c53167837135f95214692189b40d55ace949f47f3f64b422aca08d29" Feb 18 14:01:34 crc kubenswrapper[4817]: E0218 14:01:34.821624 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=oauth-openshift pod=oauth-openshift-54cb467b7d-jgbv8_openshift-authentication(4cbcb73b-c91d-47a8-ae83-38439e150615)\"" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" podUID="4cbcb73b-c91d-47a8-ae83-38439e150615" Feb 18 14:01:34 crc kubenswrapper[4817]: I0218 14:01:34.858801 4817 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b6206b93-00a9-4b30-b1fa-69ff8b96a4e4" Feb 18 14:01:35 crc kubenswrapper[4817]: I0218 14:01:35.852870 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54cb467b7d-jgbv8_4cbcb73b-c91d-47a8-ae83-38439e150615/oauth-openshift/1.log" Feb 18 14:01:41 crc kubenswrapper[4817]: I0218 14:01:41.808622 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:01:41 crc kubenswrapper[4817]: I0218 14:01:41.809069 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:01:41 crc kubenswrapper[4817]: I0218 14:01:41.809802 4817 scope.go:117] "RemoveContainer" containerID="f5261660c53167837135f95214692189b40d55ace949f47f3f64b422aca08d29" Feb 18 14:01:41 crc kubenswrapper[4817]: E0218 14:01:41.810246 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-54cb467b7d-jgbv8_openshift-authentication(4cbcb73b-c91d-47a8-ae83-38439e150615)\"" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" podUID="4cbcb73b-c91d-47a8-ae83-38439e150615" Feb 18 14:01:42 crc kubenswrapper[4817]: I0218 14:01:42.863073 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:01:42 crc kubenswrapper[4817]: I0218 14:01:42.863527 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:01:43 crc kubenswrapper[4817]: I0218 14:01:43.035805 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 18 14:01:43 crc kubenswrapper[4817]: I0218 14:01:43.136067 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 18 14:01:43 crc kubenswrapper[4817]: I0218 14:01:43.143769 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 18 14:01:43 crc kubenswrapper[4817]: I0218 14:01:43.453001 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 18 14:01:43 crc kubenswrapper[4817]: I0218 14:01:43.534192 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 18 14:01:43 crc kubenswrapper[4817]: I0218 14:01:43.735915 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 18 14:01:43 crc kubenswrapper[4817]: I0218 14:01:43.750922 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 18 14:01:44 crc kubenswrapper[4817]: I0218 14:01:44.008636 4817 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 18 14:01:44 crc kubenswrapper[4817]: I0218 14:01:44.380696 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 18 14:01:44 crc kubenswrapper[4817]: I0218 14:01:44.510512 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 18 14:01:44 crc kubenswrapper[4817]: I0218 14:01:44.575736 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 18 14:01:44 crc kubenswrapper[4817]: I0218 14:01:44.848396 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 18 14:01:44 crc kubenswrapper[4817]: I0218 14:01:44.968852 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 18 14:01:44 crc kubenswrapper[4817]: I0218 14:01:44.979563 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.009209 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.028292 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.038466 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.302636 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.353719 4817 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.361546 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.361676 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.362436 4817 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.362487 4817 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="31c676f0-4dbd-472a-8ee1-31adc0c27dd3" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.362837 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54cb467b7d-jgbv8"] Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.364162 4817 scope.go:117] "RemoveContainer" containerID="f5261660c53167837135f95214692189b40d55ace949f47f3f64b422aca08d29" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.369688 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.395671 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.398520 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.398496481 podStartE2EDuration="13.398496481s" podCreationTimestamp="2026-02-18 
14:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:01:45.391153591 +0000 UTC m=+167.966689574" watchObservedRunningTime="2026-02-18 14:01:45.398496481 +0000 UTC m=+167.974032464" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.428169 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.578749 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.632148 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.644514 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.665738 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.681262 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.686113 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.753329 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.782159 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 
14:01:45.909504 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.933327 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54cb467b7d-jgbv8_4cbcb73b-c91d-47a8-ae83-38439e150615/oauth-openshift/1.log" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.933469 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" event={"ID":"4cbcb73b-c91d-47a8-ae83-38439e150615","Type":"ContainerStarted","Data":"06f6b0070da338049e9b0236b7dc6a0ed1829c066144896ba2efefb0cc5d9d4f"} Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.935234 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:01:45 crc kubenswrapper[4817]: I0218 14:01:45.975708 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" podStartSLOduration=61.975678224 podStartE2EDuration="1m1.975678224s" podCreationTimestamp="2026-02-18 14:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:01:45.963190081 +0000 UTC m=+168.538726064" watchObservedRunningTime="2026-02-18 14:01:45.975678224 +0000 UTC m=+168.551214207" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.132291 4817 patch_prober.go:28] interesting pod/oauth-openshift-54cb467b7d-jgbv8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": read tcp 10.217.0.2:58876->10.217.0.64:6443: read: connection reset by peer" start-of-body= Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.132394 4817 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" podUID="4cbcb73b-c91d-47a8-ae83-38439e150615" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": read tcp 10.217.0.2:58876->10.217.0.64:6443: read: connection reset by peer" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.171022 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.198266 4817 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.217720 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.244172 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.286056 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.348763 4817 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.368759 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.382734 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.423763 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.519070 
4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.566939 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.591826 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.777102 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.784929 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.813445 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.943828 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54cb467b7d-jgbv8_4cbcb73b-c91d-47a8-ae83-38439e150615/oauth-openshift/2.log" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.944766 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54cb467b7d-jgbv8_4cbcb73b-c91d-47a8-ae83-38439e150615/oauth-openshift/1.log" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.944853 4817 generic.go:334] "Generic (PLEG): container finished" podID="4cbcb73b-c91d-47a8-ae83-38439e150615" containerID="06f6b0070da338049e9b0236b7dc6a0ed1829c066144896ba2efefb0cc5d9d4f" exitCode=255 Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.945068 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" event={"ID":"4cbcb73b-c91d-47a8-ae83-38439e150615","Type":"ContainerDied","Data":"06f6b0070da338049e9b0236b7dc6a0ed1829c066144896ba2efefb0cc5d9d4f"} Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.945159 4817 scope.go:117] "RemoveContainer" containerID="f5261660c53167837135f95214692189b40d55ace949f47f3f64b422aca08d29" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.946094 4817 scope.go:117] "RemoveContainer" containerID="06f6b0070da338049e9b0236b7dc6a0ed1829c066144896ba2efefb0cc5d9d4f" Feb 18 14:01:46 crc kubenswrapper[4817]: E0218 14:01:46.946532 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-54cb467b7d-jgbv8_openshift-authentication(4cbcb73b-c91d-47a8-ae83-38439e150615)\"" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" podUID="4cbcb73b-c91d-47a8-ae83-38439e150615" Feb 18 14:01:46 crc kubenswrapper[4817]: I0218 14:01:46.986476 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.011009 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.020111 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.166868 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.238606 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 18 14:01:47 
crc kubenswrapper[4817]: I0218 14:01:47.292622 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.369609 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.401448 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.490293 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.491458 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.556885 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.567807 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.664919 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.727175 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.768859 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 18 14:01:47 crc 
kubenswrapper[4817]: I0218 14:01:47.854868 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.860890 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.951780 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.957717 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54cb467b7d-jgbv8_4cbcb73b-c91d-47a8-ae83-38439e150615/oauth-openshift/2.log" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.958523 4817 scope.go:117] "RemoveContainer" containerID="06f6b0070da338049e9b0236b7dc6a0ed1829c066144896ba2efefb0cc5d9d4f" Feb 18 14:01:47 crc kubenswrapper[4817]: E0218 14:01:47.958817 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-54cb467b7d-jgbv8_openshift-authentication(4cbcb73b-c91d-47a8-ae83-38439e150615)\"" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" podUID="4cbcb73b-c91d-47a8-ae83-38439e150615" Feb 18 14:01:47 crc kubenswrapper[4817]: I0218 14:01:47.971755 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.157937 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.170104 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 18 14:01:48 crc 
kubenswrapper[4817]: I0218 14:01:48.232054 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.274310 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.363603 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.562458 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.571382 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.611359 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.616119 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.689226 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.742412 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.848098 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.906682 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.907177 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.958528 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.970282 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.981643 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 18 14:01:48 crc kubenswrapper[4817]: I0218 14:01:48.990948 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.042557 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.060961 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.311080 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.455198 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.469131 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.501967 4817 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.547928 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.614153 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.673345 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.679698 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.682537 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.747882 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.811298 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.828208 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.895089 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.947214 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 18 14:01:49 crc kubenswrapper[4817]: I0218 14:01:49.985021 
4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.047572 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.058320 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.208921 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.281661 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.282871 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.327406 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.426144 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.465455 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.488062 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.547860 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.566484 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.776327 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.794563 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 18 14:01:50 crc kubenswrapper[4817]: I0218 14:01:50.911428 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.064362 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.067467 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.074700 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.210013 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.212563 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.336706 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 
14:01:51.339385 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.339892 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.344682 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.432662 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.494130 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.503042 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.577563 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.600472 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.755739 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.783833 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.808194 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.809582 4817 scope.go:117] "RemoveContainer" containerID="06f6b0070da338049e9b0236b7dc6a0ed1829c066144896ba2efefb0cc5d9d4f" Feb 18 14:01:51 crc kubenswrapper[4817]: E0218 14:01:51.810215 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-54cb467b7d-jgbv8_openshift-authentication(4cbcb73b-c91d-47a8-ae83-38439e150615)\"" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" podUID="4cbcb73b-c91d-47a8-ae83-38439e150615" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.918816 4817 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.930963 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.934192 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.945716 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 18 14:01:51 crc kubenswrapper[4817]: I0218 14:01:51.972509 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.039453 4817 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.043585 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.071250 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.141064 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.157599 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.190419 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.190828 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.208377 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.311066 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.328788 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.385382 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.509933 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.664373 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.673075 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.681532 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.913399 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.924707 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.940219 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.954927 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.975020 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 18 14:01:52 crc kubenswrapper[4817]: I0218 14:01:52.987482 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.001851 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.075503 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 
14:01:53.243084 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.299120 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.350828 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.365232 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.448068 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.546745 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.556686 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.590340 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.593350 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.651559 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.676388 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 
14:01:53.790219 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.874256 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.914440 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.931226 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 18 14:01:53 crc kubenswrapper[4817]: I0218 14:01:53.998213 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.093877 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.103256 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.162691 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.210075 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.252079 4817 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.252413 4817 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24" gracePeriod=5 Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.287429 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.299302 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.319788 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.442375 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.574779 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.608749 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.614937 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.683778 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.712521 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.765698 4817 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.779123 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.811484 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.838157 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.980395 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 18 14:01:54 crc kubenswrapper[4817]: I0218 14:01:54.993472 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.159468 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.195378 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.267535 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.281265 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.304296 4817 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.339301 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.370516 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.374743 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.435103 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.476043 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.514393 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.540554 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.601471 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.646610 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.688233 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.699116 4817 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.709680 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.720072 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.871445 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.905681 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.907803 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 18 14:01:55 crc kubenswrapper[4817]: I0218 14:01:55.930655 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.106770 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.138658 4817 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.183888 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.241971 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 18 
14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.334425 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.369893 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.380487 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.415658 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.468092 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.551541 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.614603 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.658142 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.695904 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.745232 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.766295 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.797847 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 18 14:01:56 crc kubenswrapper[4817]: I0218 14:01:56.831618 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 18 14:01:57 crc kubenswrapper[4817]: I0218 14:01:57.147889 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 18 14:01:57 crc kubenswrapper[4817]: I0218 14:01:57.163413 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 18 14:01:57 crc kubenswrapper[4817]: I0218 14:01:57.234225 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 18 14:01:57 crc kubenswrapper[4817]: I0218 14:01:57.261683 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 18 14:01:57 crc kubenswrapper[4817]: I0218 14:01:57.394419 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 18 14:01:57 crc kubenswrapper[4817]: I0218 14:01:57.451602 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 18 14:01:57 crc kubenswrapper[4817]: I0218 14:01:57.457723 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 18 14:01:57 crc kubenswrapper[4817]: I0218 14:01:57.518552 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 18 14:01:57 crc kubenswrapper[4817]: I0218 14:01:57.992488 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 18 14:01:58 crc kubenswrapper[4817]: I0218 14:01:58.102893 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 18 14:01:58 crc kubenswrapper[4817]: I0218 14:01:58.264357 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 18 14:01:58 crc kubenswrapper[4817]: I0218 14:01:58.332210 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 18 14:01:58 crc kubenswrapper[4817]: I0218 14:01:58.350266 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 18 14:01:58 crc kubenswrapper[4817]: I0218 14:01:58.382186 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 18 14:01:58 crc kubenswrapper[4817]: I0218 14:01:58.436453 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 18 14:01:58 crc kubenswrapper[4817]: I0218 14:01:58.604919 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 18 14:01:58 crc kubenswrapper[4817]: I0218 14:01:58.691817 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 18 14:01:58 crc kubenswrapper[4817]: I0218 14:01:58.931053 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 18 14:01:59 crc kubenswrapper[4817]: I0218 14:01:59.116097 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 18 14:01:59 crc kubenswrapper[4817]: I0218 14:01:59.210493 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 18 14:01:59 crc kubenswrapper[4817]: I0218 14:01:59.838715 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 18 14:01:59 crc kubenswrapper[4817]: I0218 14:01:59.838812 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 14:01:59 crc kubenswrapper[4817]: I0218 14:01:59.974566 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.022352 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.022487 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.022563 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.022620 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.022650 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.022616 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.022751 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.022775 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.022818 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.023297 4817 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.023315 4817 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.023325 4817 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.023337 4817 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.033254 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.042768 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.042846 4817 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24" exitCode=137
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.042930 4817 scope.go:117] "RemoveContainer" containerID="aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24"
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.043062 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.103525 4817 scope.go:117] "RemoveContainer" containerID="aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24"
Feb 18 14:02:00 crc kubenswrapper[4817]: E0218 14:02:00.104505 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24\": container with ID starting with aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24 not found: ID does not exist" containerID="aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24"
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.104587 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24"} err="failed to get container status \"aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24\": rpc error: code = NotFound desc = could not find container \"aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24\": container with ID starting with aa48396296cb773da3dcc1d02047b9113f090271e65858f09b881aebfc496b24 not found: ID does not exist"
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.124822 4817 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.181538 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.353357 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.412260 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf"]
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.412621 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" podUID="03358e20-1724-438e-b1eb-3d9e8ea550e4" containerName="route-controller-manager" containerID="cri-o://ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8" gracePeriod=30
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.433186 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78c74bbf86-277pl"]
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.433656 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" podUID="0f24f96b-bc64-4d8e-be89-4b64b35ca424" containerName="controller-manager" containerID="cri-o://f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754" gracePeriod=30
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.928090 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf"
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.936433 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-config\") pod \"03358e20-1724-438e-b1eb-3d9e8ea550e4\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") "
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.936520 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-client-ca\") pod \"03358e20-1724-438e-b1eb-3d9e8ea550e4\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") "
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.936569 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqrq2\" (UniqueName: \"kubernetes.io/projected/03358e20-1724-438e-b1eb-3d9e8ea550e4-kube-api-access-gqrq2\") pod \"03358e20-1724-438e-b1eb-3d9e8ea550e4\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") "
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.936668 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03358e20-1724-438e-b1eb-3d9e8ea550e4-serving-cert\") pod \"03358e20-1724-438e-b1eb-3d9e8ea550e4\" (UID: \"03358e20-1724-438e-b1eb-3d9e8ea550e4\") "
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.937610 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-client-ca" (OuterVolumeSpecName: "client-ca") pod "03358e20-1724-438e-b1eb-3d9e8ea550e4" (UID: "03358e20-1724-438e-b1eb-3d9e8ea550e4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.937682 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-config" (OuterVolumeSpecName: "config") pod "03358e20-1724-438e-b1eb-3d9e8ea550e4" (UID: "03358e20-1724-438e-b1eb-3d9e8ea550e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.942852 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03358e20-1724-438e-b1eb-3d9e8ea550e4-kube-api-access-gqrq2" (OuterVolumeSpecName: "kube-api-access-gqrq2") pod "03358e20-1724-438e-b1eb-3d9e8ea550e4" (UID: "03358e20-1724-438e-b1eb-3d9e8ea550e4"). InnerVolumeSpecName "kube-api-access-gqrq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.943912 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03358e20-1724-438e-b1eb-3d9e8ea550e4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "03358e20-1724-438e-b1eb-3d9e8ea550e4" (UID: "03358e20-1724-438e-b1eb-3d9e8ea550e4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:02:00 crc kubenswrapper[4817]: I0218 14:02:00.973912 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.038594 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-proxy-ca-bundles\") pod \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") "
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.038730 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzjfc\" (UniqueName: \"kubernetes.io/projected/0f24f96b-bc64-4d8e-be89-4b64b35ca424-kube-api-access-lzjfc\") pod \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") "
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.038769 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f24f96b-bc64-4d8e-be89-4b64b35ca424-serving-cert\") pod \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") "
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.039467 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-config\") pod \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") "
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.039519 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-client-ca\") pod \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\" (UID: \"0f24f96b-bc64-4d8e-be89-4b64b35ca424\") "
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.040025 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.040045 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03358e20-1724-438e-b1eb-3d9e8ea550e4-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.040056 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqrq2\" (UniqueName: \"kubernetes.io/projected/03358e20-1724-438e-b1eb-3d9e8ea550e4-kube-api-access-gqrq2\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.040071 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03358e20-1724-438e-b1eb-3d9e8ea550e4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.040970 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-config" (OuterVolumeSpecName: "config") pod "0f24f96b-bc64-4d8e-be89-4b64b35ca424" (UID: "0f24f96b-bc64-4d8e-be89-4b64b35ca424"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.041288 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0f24f96b-bc64-4d8e-be89-4b64b35ca424" (UID: "0f24f96b-bc64-4d8e-be89-4b64b35ca424"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.042076 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-client-ca" (OuterVolumeSpecName: "client-ca") pod "0f24f96b-bc64-4d8e-be89-4b64b35ca424" (UID: "0f24f96b-bc64-4d8e-be89-4b64b35ca424"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.044215 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f24f96b-bc64-4d8e-be89-4b64b35ca424-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0f24f96b-bc64-4d8e-be89-4b64b35ca424" (UID: "0f24f96b-bc64-4d8e-be89-4b64b35ca424"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.044572 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f24f96b-bc64-4d8e-be89-4b64b35ca424-kube-api-access-lzjfc" (OuterVolumeSpecName: "kube-api-access-lzjfc") pod "0f24f96b-bc64-4d8e-be89-4b64b35ca424" (UID: "0f24f96b-bc64-4d8e-be89-4b64b35ca424"). InnerVolumeSpecName "kube-api-access-lzjfc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.055279 4817 generic.go:334] "Generic (PLEG): container finished" podID="03358e20-1724-438e-b1eb-3d9e8ea550e4" containerID="ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8" exitCode=0
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.055378 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.055401 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" event={"ID":"03358e20-1724-438e-b1eb-3d9e8ea550e4","Type":"ContainerDied","Data":"ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8"}
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.055535 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf" event={"ID":"03358e20-1724-438e-b1eb-3d9e8ea550e4","Type":"ContainerDied","Data":"dfee772858f26ab02ae54e51815f6c4955cbe4fed51cdf8f3ed4a22d3d28ee20"}
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.055566 4817 scope.go:117] "RemoveContainer" containerID="ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.060907 4817 generic.go:334] "Generic (PLEG): container finished" podID="0f24f96b-bc64-4d8e-be89-4b64b35ca424" containerID="f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754" exitCode=0
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.060945 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" event={"ID":"0f24f96b-bc64-4d8e-be89-4b64b35ca424","Type":"ContainerDied","Data":"f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754"}
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.060993 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl" event={"ID":"0f24f96b-bc64-4d8e-be89-4b64b35ca424","Type":"ContainerDied","Data":"2c8a52905476463555a33d7685f2400f6515981f007fb379d32b98c00a426d81"}
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.061059 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c74bbf86-277pl"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.076508 4817 scope.go:117] "RemoveContainer" containerID="ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8"
Feb 18 14:02:01 crc kubenswrapper[4817]: E0218 14:02:01.077104 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8\": container with ID starting with ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8 not found: ID does not exist" containerID="ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.077171 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8"} err="failed to get container status \"ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8\": rpc error: code = NotFound desc = could not find container \"ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8\": container with ID starting with ee9b4bc9970eb5d0e3787349501bd01e48d196cdc5483e58078728cd2e03b0e8 not found: ID does not exist"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.077202 4817 scope.go:117] "RemoveContainer" containerID="f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.087229 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf"]
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.091997 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8cd4bb4b-8wbjf"]
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.104172 4817 scope.go:117] "RemoveContainer" containerID="f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754"
Feb 18 14:02:01 crc kubenswrapper[4817]: E0218 14:02:01.105008 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754\": container with ID starting with f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754 not found: ID does not exist" containerID="f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.105071 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754"} err="failed to get container status \"f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754\": rpc error: code = NotFound desc = could not find container \"f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754\": container with ID starting with f89359d23961476d842cdd26f2f753c3c0f3a162b22885b4f584ba3d2dff1754 not found: ID does not exist"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.109210 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78c74bbf86-277pl"]
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.113534 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78c74bbf86-277pl"]
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.141337 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.141366 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.141376 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f24f96b-bc64-4d8e-be89-4b64b35ca424-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.141388 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzjfc\" (UniqueName: \"kubernetes.io/projected/0f24f96b-bc64-4d8e-be89-4b64b35ca424-kube-api-access-lzjfc\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.141398 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f24f96b-bc64-4d8e-be89-4b64b35ca424-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.517858 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"]
Feb 18 14:02:01 crc kubenswrapper[4817]: E0218 14:02:01.518235 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" containerName="installer"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.518254 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" containerName="installer"
Feb 18 14:02:01 crc kubenswrapper[4817]: E0218 14:02:01.518274 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f24f96b-bc64-4d8e-be89-4b64b35ca424" containerName="controller-manager"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.518281 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f24f96b-bc64-4d8e-be89-4b64b35ca424" containerName="controller-manager"
Feb 18 14:02:01 crc kubenswrapper[4817]: E0218 14:02:01.518289 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.518295 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 18 14:02:01 crc kubenswrapper[4817]: E0218 14:02:01.518310 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03358e20-1724-438e-b1eb-3d9e8ea550e4" containerName="route-controller-manager"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.518317 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="03358e20-1724-438e-b1eb-3d9e8ea550e4" containerName="route-controller-manager"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.518410 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f24f96b-bc64-4d8e-be89-4b64b35ca424" containerName="controller-manager"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.518421 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.518460 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98705e3-757f-4e25-88df-5dcb9a727afa" containerName="installer"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.518469 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="03358e20-1724-438e-b1eb-3d9e8ea550e4" containerName="route-controller-manager"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.519007 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.523083 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.523304 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.523406 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.524151 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.524154 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.525480 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.537401 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"]
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.548390 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/276a5a52-dfb7-4993-907e-37fc443b2e3a-serving-cert\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.548516 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-client-ca\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.548553 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-config\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.548586 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5ff5\" (UniqueName: \"kubernetes.io/projected/276a5a52-dfb7-4993-907e-37fc443b2e3a-kube-api-access-q5ff5\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.649649 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-client-ca\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"
Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.649720 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-config\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID:
\"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.649753 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5ff5\" (UniqueName: \"kubernetes.io/projected/276a5a52-dfb7-4993-907e-37fc443b2e3a-kube-api-access-q5ff5\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.649799 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/276a5a52-dfb7-4993-907e-37fc443b2e3a-serving-cert\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.651054 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-client-ca\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.651679 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-config\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.657925 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/276a5a52-dfb7-4993-907e-37fc443b2e3a-serving-cert\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.668678 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5ff5\" (UniqueName: \"kubernetes.io/projected/276a5a52-dfb7-4993-907e-37fc443b2e3a-kube-api-access-q5ff5\") pod \"route-controller-manager-664cbf5c87-dpdxl\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:01 crc kubenswrapper[4817]: I0218 14:02:01.863164 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.180652 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03358e20-1724-438e-b1eb-3d9e8ea550e4" path="/var/lib/kubelet/pods/03358e20-1724-438e-b1eb-3d9e8ea550e4/volumes" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.182746 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f24f96b-bc64-4d8e-be89-4b64b35ca424" path="/var/lib/kubelet/pods/0f24f96b-bc64-4d8e-be89-4b64b35ca424/volumes" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.318752 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"] Feb 18 14:02:02 crc kubenswrapper[4817]: W0218 14:02:02.326364 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod276a5a52_dfb7_4993_907e_37fc443b2e3a.slice/crio-ac6206f7f589214d2778d6d0fa9f46c0ee1d89fbd2b8671d8793029986f338b2 WatchSource:0}: Error finding 
container ac6206f7f589214d2778d6d0fa9f46c0ee1d89fbd2b8671d8793029986f338b2: Status 404 returned error can't find the container with id ac6206f7f589214d2778d6d0fa9f46c0ee1d89fbd2b8671d8793029986f338b2 Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.518520 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55f8f55785-lzxc5"] Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.519581 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.522829 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.522864 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.523195 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.524611 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.524649 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.525075 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.530230 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55f8f55785-lzxc5"] Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.544068 4817 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.561739 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-proxy-ca-bundles\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.561839 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5kkf\" (UniqueName: \"kubernetes.io/projected/ab3a1628-f65f-495a-83f9-bfb536d40c12-kube-api-access-r5kkf\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.561878 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-config\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.561915 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab3a1628-f65f-495a-83f9-bfb536d40c12-serving-cert\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.563284 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-client-ca\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.666023 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-proxy-ca-bundles\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.666146 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5kkf\" (UniqueName: \"kubernetes.io/projected/ab3a1628-f65f-495a-83f9-bfb536d40c12-kube-api-access-r5kkf\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.666218 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-config\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.666312 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab3a1628-f65f-495a-83f9-bfb536d40c12-serving-cert\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: 
I0218 14:02:02.666370 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-client-ca\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.667261 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-proxy-ca-bundles\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.668420 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-config\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.668565 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-client-ca\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.689930 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab3a1628-f65f-495a-83f9-bfb536d40c12-serving-cert\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" 
Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.692887 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5kkf\" (UniqueName: \"kubernetes.io/projected/ab3a1628-f65f-495a-83f9-bfb536d40c12-kube-api-access-r5kkf\") pod \"controller-manager-55f8f55785-lzxc5\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") " pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:02 crc kubenswrapper[4817]: I0218 14:02:02.857285 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:03 crc kubenswrapper[4817]: I0218 14:02:03.078631 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" event={"ID":"276a5a52-dfb7-4993-907e-37fc443b2e3a","Type":"ContainerStarted","Data":"d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c"} Feb 18 14:02:03 crc kubenswrapper[4817]: I0218 14:02:03.078690 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" event={"ID":"276a5a52-dfb7-4993-907e-37fc443b2e3a","Type":"ContainerStarted","Data":"ac6206f7f589214d2778d6d0fa9f46c0ee1d89fbd2b8671d8793029986f338b2"} Feb 18 14:02:03 crc kubenswrapper[4817]: I0218 14:02:03.079000 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:03 crc kubenswrapper[4817]: I0218 14:02:03.085347 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:03 crc kubenswrapper[4817]: I0218 14:02:03.105050 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" 
podStartSLOduration=3.105015843 podStartE2EDuration="3.105015843s" podCreationTimestamp="2026-02-18 14:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:02:03.099877 +0000 UTC m=+185.675412993" watchObservedRunningTime="2026-02-18 14:02:03.105015843 +0000 UTC m=+185.680551826" Feb 18 14:02:03 crc kubenswrapper[4817]: I0218 14:02:03.300358 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55f8f55785-lzxc5"] Feb 18 14:02:03 crc kubenswrapper[4817]: W0218 14:02:03.311641 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab3a1628_f65f_495a_83f9_bfb536d40c12.slice/crio-92b878825927715050a95af74b8af9b021335ca64794e0b4ce224a77b9b2d85b WatchSource:0}: Error finding container 92b878825927715050a95af74b8af9b021335ca64794e0b4ce224a77b9b2d85b: Status 404 returned error can't find the container with id 92b878825927715050a95af74b8af9b021335ca64794e0b4ce224a77b9b2d85b Feb 18 14:02:04 crc kubenswrapper[4817]: I0218 14:02:04.086771 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" event={"ID":"ab3a1628-f65f-495a-83f9-bfb536d40c12","Type":"ContainerStarted","Data":"74848d067ef0dafd3de0d3cb820eba29adaec983c90da15c6fef4a1155ea8df8"} Feb 18 14:02:04 crc kubenswrapper[4817]: I0218 14:02:04.087206 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:04 crc kubenswrapper[4817]: I0218 14:02:04.088363 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" event={"ID":"ab3a1628-f65f-495a-83f9-bfb536d40c12","Type":"ContainerStarted","Data":"92b878825927715050a95af74b8af9b021335ca64794e0b4ce224a77b9b2d85b"} 
Feb 18 14:02:04 crc kubenswrapper[4817]: I0218 14:02:04.093208 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" Feb 18 14:02:04 crc kubenswrapper[4817]: I0218 14:02:04.108456 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" podStartSLOduration=4.108438275 podStartE2EDuration="4.108438275s" podCreationTimestamp="2026-02-18 14:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:02:04.106865384 +0000 UTC m=+186.682401367" watchObservedRunningTime="2026-02-18 14:02:04.108438275 +0000 UTC m=+186.683974258" Feb 18 14:02:04 crc kubenswrapper[4817]: I0218 14:02:04.172215 4817 scope.go:117] "RemoveContainer" containerID="06f6b0070da338049e9b0236b7dc6a0ed1829c066144896ba2efefb0cc5d9d4f" Feb 18 14:02:04 crc kubenswrapper[4817]: E0218 14:02:04.172523 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-54cb467b7d-jgbv8_openshift-authentication(4cbcb73b-c91d-47a8-ae83-38439e150615)\"" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" podUID="4cbcb73b-c91d-47a8-ae83-38439e150615" Feb 18 14:02:12 crc kubenswrapper[4817]: I0218 14:02:12.864469 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:02:12 crc kubenswrapper[4817]: I0218 14:02:12.865347 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" 
podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:02:19 crc kubenswrapper[4817]: I0218 14:02:19.171037 4817 scope.go:117] "RemoveContainer" containerID="06f6b0070da338049e9b0236b7dc6a0ed1829c066144896ba2efefb0cc5d9d4f" Feb 18 14:02:20 crc kubenswrapper[4817]: I0218 14:02:20.193216 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-54cb467b7d-jgbv8_4cbcb73b-c91d-47a8-ae83-38439e150615/oauth-openshift/2.log" Feb 18 14:02:20 crc kubenswrapper[4817]: I0218 14:02:20.193785 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" event={"ID":"4cbcb73b-c91d-47a8-ae83-38439e150615","Type":"ContainerStarted","Data":"d3a70d405438b8ccee618083df3f70a473c4fafdfdaa6cc2551a2d605e61c0fa"} Feb 18 14:02:20 crc kubenswrapper[4817]: I0218 14:02:20.194939 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:02:20 crc kubenswrapper[4817]: I0218 14:02:20.205957 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-54cb467b7d-jgbv8" Feb 18 14:02:20 crc kubenswrapper[4817]: I0218 14:02:20.372358 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55f8f55785-lzxc5"] Feb 18 14:02:20 crc kubenswrapper[4817]: I0218 14:02:20.372629 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" podUID="ab3a1628-f65f-495a-83f9-bfb536d40c12" containerName="controller-manager" containerID="cri-o://74848d067ef0dafd3de0d3cb820eba29adaec983c90da15c6fef4a1155ea8df8" gracePeriod=30 Feb 18 14:02:20 crc kubenswrapper[4817]: I0218 
14:02:20.394318 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"] Feb 18 14:02:20 crc kubenswrapper[4817]: I0218 14:02:20.394537 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" podUID="276a5a52-dfb7-4993-907e-37fc443b2e3a" containerName="route-controller-manager" containerID="cri-o://d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c" gracePeriod=30 Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.073523 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.194482 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5ff5\" (UniqueName: \"kubernetes.io/projected/276a5a52-dfb7-4993-907e-37fc443b2e3a-kube-api-access-q5ff5\") pod \"276a5a52-dfb7-4993-907e-37fc443b2e3a\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.194529 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-config\") pod \"276a5a52-dfb7-4993-907e-37fc443b2e3a\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.194611 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-client-ca\") pod \"276a5a52-dfb7-4993-907e-37fc443b2e3a\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.194645 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/276a5a52-dfb7-4993-907e-37fc443b2e3a-serving-cert\") pod \"276a5a52-dfb7-4993-907e-37fc443b2e3a\" (UID: \"276a5a52-dfb7-4993-907e-37fc443b2e3a\") " Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.195666 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-client-ca" (OuterVolumeSpecName: "client-ca") pod "276a5a52-dfb7-4993-907e-37fc443b2e3a" (UID: "276a5a52-dfb7-4993-907e-37fc443b2e3a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.195804 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-config" (OuterVolumeSpecName: "config") pod "276a5a52-dfb7-4993-907e-37fc443b2e3a" (UID: "276a5a52-dfb7-4993-907e-37fc443b2e3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.200622 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/276a5a52-dfb7-4993-907e-37fc443b2e3a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "276a5a52-dfb7-4993-907e-37fc443b2e3a" (UID: "276a5a52-dfb7-4993-907e-37fc443b2e3a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.201837 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/276a5a52-dfb7-4993-907e-37fc443b2e3a-kube-api-access-q5ff5" (OuterVolumeSpecName: "kube-api-access-q5ff5") pod "276a5a52-dfb7-4993-907e-37fc443b2e3a" (UID: "276a5a52-dfb7-4993-907e-37fc443b2e3a"). InnerVolumeSpecName "kube-api-access-q5ff5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.202844 4817 generic.go:334] "Generic (PLEG): container finished" podID="276a5a52-dfb7-4993-907e-37fc443b2e3a" containerID="d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c" exitCode=0 Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.202918 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" event={"ID":"276a5a52-dfb7-4993-907e-37fc443b2e3a","Type":"ContainerDied","Data":"d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c"} Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.202949 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl" event={"ID":"276a5a52-dfb7-4993-907e-37fc443b2e3a","Type":"ContainerDied","Data":"ac6206f7f589214d2778d6d0fa9f46c0ee1d89fbd2b8671d8793029986f338b2"} Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.202968 4817 scope.go:117] "RemoveContainer" containerID="d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c" Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.203102 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.206250 4817 generic.go:334] "Generic (PLEG): container finished" podID="ab3a1628-f65f-495a-83f9-bfb536d40c12" containerID="74848d067ef0dafd3de0d3cb820eba29adaec983c90da15c6fef4a1155ea8df8" exitCode=0
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.207001 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" event={"ID":"ab3a1628-f65f-495a-83f9-bfb536d40c12","Type":"ContainerDied","Data":"74848d067ef0dafd3de0d3cb820eba29adaec983c90da15c6fef4a1155ea8df8"}
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.255282 4817 scope.go:117] "RemoveContainer" containerID="d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c"
Feb 18 14:02:21 crc kubenswrapper[4817]: E0218 14:02:21.255701 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c\": container with ID starting with d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c not found: ID does not exist" containerID="d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.255733 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c"} err="failed to get container status \"d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c\": rpc error: code = NotFound desc = could not find container \"d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c\": container with ID starting with d093ea42aa3f29e8e629d8c7c5e4adbc90b7698b59ba1510a2d2726d5dbf2f5c not found: ID does not exist"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.256051 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.276043 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"]
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.278853 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-664cbf5c87-dpdxl"]
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.296023 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab3a1628-f65f-495a-83f9-bfb536d40c12-serving-cert\") pod \"ab3a1628-f65f-495a-83f9-bfb536d40c12\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") "
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.296095 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-config\") pod \"ab3a1628-f65f-495a-83f9-bfb536d40c12\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") "
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.296120 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-proxy-ca-bundles\") pod \"ab3a1628-f65f-495a-83f9-bfb536d40c12\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") "
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.296154 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-client-ca\") pod \"ab3a1628-f65f-495a-83f9-bfb536d40c12\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") "
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.296178 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5kkf\" (UniqueName: \"kubernetes.io/projected/ab3a1628-f65f-495a-83f9-bfb536d40c12-kube-api-access-r5kkf\") pod \"ab3a1628-f65f-495a-83f9-bfb536d40c12\" (UID: \"ab3a1628-f65f-495a-83f9-bfb536d40c12\") "
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.296486 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/276a5a52-dfb7-4993-907e-37fc443b2e3a-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.296502 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5ff5\" (UniqueName: \"kubernetes.io/projected/276a5a52-dfb7-4993-907e-37fc443b2e3a-kube-api-access-q5ff5\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.296514 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.296523 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/276a5a52-dfb7-4993-907e-37fc443b2e3a-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.297479 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab3a1628-f65f-495a-83f9-bfb536d40c12" (UID: "ab3a1628-f65f-495a-83f9-bfb536d40c12"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.297568 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-config" (OuterVolumeSpecName: "config") pod "ab3a1628-f65f-495a-83f9-bfb536d40c12" (UID: "ab3a1628-f65f-495a-83f9-bfb536d40c12"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.297553 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ab3a1628-f65f-495a-83f9-bfb536d40c12" (UID: "ab3a1628-f65f-495a-83f9-bfb536d40c12"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.301186 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3a1628-f65f-495a-83f9-bfb536d40c12-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab3a1628-f65f-495a-83f9-bfb536d40c12" (UID: "ab3a1628-f65f-495a-83f9-bfb536d40c12"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.305148 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3a1628-f65f-495a-83f9-bfb536d40c12-kube-api-access-r5kkf" (OuterVolumeSpecName: "kube-api-access-r5kkf") pod "ab3a1628-f65f-495a-83f9-bfb536d40c12" (UID: "ab3a1628-f65f-495a-83f9-bfb536d40c12"). InnerVolumeSpecName "kube-api-access-r5kkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.397688 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab3a1628-f65f-495a-83f9-bfb536d40c12-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.397728 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.397737 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.397749 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab3a1628-f65f-495a-83f9-bfb536d40c12-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.397760 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5kkf\" (UniqueName: \"kubernetes.io/projected/ab3a1628-f65f-495a-83f9-bfb536d40c12-kube-api-access-r5kkf\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.524917 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b45548f64-5nmnc"]
Feb 18 14:02:21 crc kubenswrapper[4817]: E0218 14:02:21.525233 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3a1628-f65f-495a-83f9-bfb536d40c12" containerName="controller-manager"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.525250 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3a1628-f65f-495a-83f9-bfb536d40c12" containerName="controller-manager"
Feb 18 14:02:21 crc kubenswrapper[4817]: E0218 14:02:21.525266 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276a5a52-dfb7-4993-907e-37fc443b2e3a" containerName="route-controller-manager"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.525275 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="276a5a52-dfb7-4993-907e-37fc443b2e3a" containerName="route-controller-manager"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.525378 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="276a5a52-dfb7-4993-907e-37fc443b2e3a" containerName="route-controller-manager"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.525397 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3a1628-f65f-495a-83f9-bfb536d40c12" containerName="controller-manager"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.525852 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.530248 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"]
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.531954 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.534334 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b45548f64-5nmnc"]
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.534532 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.534552 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.534596 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.534707 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.534720 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.535114 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.541008 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"]
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.599973 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998d1dac-2178-4614-ba73-3032370ae8b4-serving-cert\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.600086 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdcwl\" (UniqueName: \"kubernetes.io/projected/998d1dac-2178-4614-ba73-3032370ae8b4-kube-api-access-xdcwl\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.600114 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-client-ca\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.600133 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-proxy-ca-bundles\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.600165 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns6g7\" (UniqueName: \"kubernetes.io/projected/64967ad0-c726-4605-85fd-7c44a64e8f4c-kube-api-access-ns6g7\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.600185 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64967ad0-c726-4605-85fd-7c44a64e8f4c-serving-cert\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.600207 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-config\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.600314 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-config\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.600354 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-client-ca\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.701162 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64967ad0-c726-4605-85fd-7c44a64e8f4c-serving-cert\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.701207 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns6g7\" (UniqueName: \"kubernetes.io/projected/64967ad0-c726-4605-85fd-7c44a64e8f4c-kube-api-access-ns6g7\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.701230 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-config\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.701264 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-config\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.701288 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-client-ca\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.701318 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998d1dac-2178-4614-ba73-3032370ae8b4-serving-cert\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.701353 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdcwl\" (UniqueName: \"kubernetes.io/projected/998d1dac-2178-4614-ba73-3032370ae8b4-kube-api-access-xdcwl\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.701369 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-client-ca\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.701383 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-proxy-ca-bundles\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.703040 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-proxy-ca-bundles\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.703054 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-client-ca\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.703249 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-config\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.703470 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-config\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.703842 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-client-ca\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.706662 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998d1dac-2178-4614-ba73-3032370ae8b4-serving-cert\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.714794 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64967ad0-c726-4605-85fd-7c44a64e8f4c-serving-cert\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.722186 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns6g7\" (UniqueName: \"kubernetes.io/projected/64967ad0-c726-4605-85fd-7c44a64e8f4c-kube-api-access-ns6g7\") pod \"route-controller-manager-fc95d9d9d-7xwb7\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") " pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.726116 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdcwl\" (UniqueName: \"kubernetes.io/projected/998d1dac-2178-4614-ba73-3032370ae8b4-kube-api-access-xdcwl\") pod \"controller-manager-7b45548f64-5nmnc\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") " pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.873518 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:21 crc kubenswrapper[4817]: I0218 14:02:21.880998 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:22 crc kubenswrapper[4817]: I0218 14:02:22.114566 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b45548f64-5nmnc"]
Feb 18 14:02:22 crc kubenswrapper[4817]: I0218 14:02:22.203393 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="276a5a52-dfb7-4993-907e-37fc443b2e3a" path="/var/lib/kubelet/pods/276a5a52-dfb7-4993-907e-37fc443b2e3a/volumes"
Feb 18 14:02:22 crc kubenswrapper[4817]: I0218 14:02:22.204615 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"]
Feb 18 14:02:22 crc kubenswrapper[4817]: W0218 14:02:22.213085 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64967ad0_c726_4605_85fd_7c44a64e8f4c.slice/crio-baddaabfe37e0db58db2cd00e03b33f7f9d81a71a909cd5d34ac0a97929908c2 WatchSource:0}: Error finding container baddaabfe37e0db58db2cd00e03b33f7f9d81a71a909cd5d34ac0a97929908c2: Status 404 returned error can't find the container with id baddaabfe37e0db58db2cd00e03b33f7f9d81a71a909cd5d34ac0a97929908c2
Feb 18 14:02:22 crc kubenswrapper[4817]: I0218 14:02:22.214388 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5"
Feb 18 14:02:22 crc kubenswrapper[4817]: I0218 14:02:22.214465 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55f8f55785-lzxc5" event={"ID":"ab3a1628-f65f-495a-83f9-bfb536d40c12","Type":"ContainerDied","Data":"92b878825927715050a95af74b8af9b021335ca64794e0b4ce224a77b9b2d85b"}
Feb 18 14:02:22 crc kubenswrapper[4817]: I0218 14:02:22.214547 4817 scope.go:117] "RemoveContainer" containerID="74848d067ef0dafd3de0d3cb820eba29adaec983c90da15c6fef4a1155ea8df8"
Feb 18 14:02:22 crc kubenswrapper[4817]: I0218 14:02:22.218269 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc" event={"ID":"998d1dac-2178-4614-ba73-3032370ae8b4","Type":"ContainerStarted","Data":"57694da48de66ca969b73783f195907a38102d982b7d9f47777bd2a6e994641c"}
Feb 18 14:02:22 crc kubenswrapper[4817]: I0218 14:02:22.261382 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55f8f55785-lzxc5"]
Feb 18 14:02:22 crc kubenswrapper[4817]: I0218 14:02:22.263732 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55f8f55785-lzxc5"]
Feb 18 14:02:23 crc kubenswrapper[4817]: I0218 14:02:23.228467 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7" event={"ID":"64967ad0-c726-4605-85fd-7c44a64e8f4c","Type":"ContainerStarted","Data":"ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c"}
Feb 18 14:02:23 crc kubenswrapper[4817]: I0218 14:02:23.229370 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:23 crc kubenswrapper[4817]: I0218 14:02:23.229385 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7" event={"ID":"64967ad0-c726-4605-85fd-7c44a64e8f4c","Type":"ContainerStarted","Data":"baddaabfe37e0db58db2cd00e03b33f7f9d81a71a909cd5d34ac0a97929908c2"}
Feb 18 14:02:23 crc kubenswrapper[4817]: I0218 14:02:23.230748 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc" event={"ID":"998d1dac-2178-4614-ba73-3032370ae8b4","Type":"ContainerStarted","Data":"1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b"}
Feb 18 14:02:23 crc kubenswrapper[4817]: I0218 14:02:23.231025 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:23 crc kubenswrapper[4817]: I0218 14:02:23.235711 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:23 crc kubenswrapper[4817]: I0218 14:02:23.236811 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:02:23 crc kubenswrapper[4817]: I0218 14:02:23.284544 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7" podStartSLOduration=3.284524073 podStartE2EDuration="3.284524073s" podCreationTimestamp="2026-02-18 14:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:02:23.262441207 +0000 UTC m=+205.837977200" watchObservedRunningTime="2026-02-18 14:02:23.284524073 +0000 UTC m=+205.860060056"
Feb 18 14:02:23 crc kubenswrapper[4817]: I0218 14:02:23.285559 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc" podStartSLOduration=3.285552299 podStartE2EDuration="3.285552299s" podCreationTimestamp="2026-02-18 14:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:02:23.284087532 +0000 UTC m=+205.859623525" watchObservedRunningTime="2026-02-18 14:02:23.285552299 +0000 UTC m=+205.861088282"
Feb 18 14:02:24 crc kubenswrapper[4817]: I0218 14:02:24.190342 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab3a1628-f65f-495a-83f9-bfb536d40c12" path="/var/lib/kubelet/pods/ab3a1628-f65f-495a-83f9-bfb536d40c12/volumes"
Feb 18 14:02:36 crc kubenswrapper[4817]: I0218 14:02:36.965324 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g6m46"]
Feb 18 14:02:36 crc kubenswrapper[4817]: I0218 14:02:36.967699 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:36 crc kubenswrapper[4817]: I0218 14:02:36.991021 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g6m46"]
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.114413 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-bound-sa-token\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.114480 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.114518 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.114558 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-registry-tls\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.114600 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.114621 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-trusted-ca\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.114652 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs84q\" (UniqueName: \"kubernetes.io/projected/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-kube-api-access-vs84q\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.114682 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-registry-certificates\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.138540 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.216310 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-bound-sa-token\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.216383 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.216424 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-registry-tls\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.216454 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.216469 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-trusted-ca\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.216482 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs84q\" (UniqueName: \"kubernetes.io/projected/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-kube-api-access-vs84q\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.216499 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-registry-certificates\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.217695 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-registry-certificates\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.218741 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-trusted-ca\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.221293 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.223304 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-registry-tls\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.225353 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.238428 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-bound-sa-token\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.239702 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs84q\" (UniqueName: \"kubernetes.io/projected/41e71f68-88cb-4a2c-bcd0-6e781d1d4db2-kube-api-access-vs84q\") pod \"image-registry-66df7c8f76-g6m46\" (UID: \"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2\") " pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.291447 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:37 crc kubenswrapper[4817]: I0218 14:02:37.719654 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g6m46"]
Feb 18 14:02:37 crc kubenswrapper[4817]: W0218 14:02:37.727634 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41e71f68_88cb_4a2c_bcd0_6e781d1d4db2.slice/crio-e25347c386fc20cf49b00e12b039516470d9d47a25b01d2a3a650156d4386856 WatchSource:0}: Error finding container e25347c386fc20cf49b00e12b039516470d9d47a25b01d2a3a650156d4386856: Status 404 returned error can't find the container with id e25347c386fc20cf49b00e12b039516470d9d47a25b01d2a3a650156d4386856
Feb 18 14:02:38 crc kubenswrapper[4817]: I0218 14:02:38.326899 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-g6m46" event={"ID":"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2","Type":"ContainerStarted","Data":"19d9357924a170a69d926a04e0ee320c81c940ae7148f7b3c69fbaa8b7ecf8e4"}
Feb 18 14:02:38 crc kubenswrapper[4817]: I0218 14:02:38.327374 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-g6m46" event={"ID":"41e71f68-88cb-4a2c-bcd0-6e781d1d4db2","Type":"ContainerStarted","Data":"e25347c386fc20cf49b00e12b039516470d9d47a25b01d2a3a650156d4386856"}
Feb 18 14:02:38 crc kubenswrapper[4817]: I0218 14:02:38.327550 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:38 crc kubenswrapper[4817]: I0218 14:02:38.369583 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-g6m46" podStartSLOduration=2.369552034 podStartE2EDuration="2.369552034s" podCreationTimestamp="2026-02-18 14:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:02:38.363144105 +0000 UTC m=+220.938680098" watchObservedRunningTime="2026-02-18 14:02:38.369552034 +0000 UTC m=+220.945088017"
Feb 18 14:02:40 crc kubenswrapper[4817]: I0218 14:02:40.379690 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"]
Feb 18 14:02:40 crc kubenswrapper[4817]: I0218 14:02:40.380267 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7" podUID="64967ad0-c726-4605-85fd-7c44a64e8f4c" containerName="route-controller-manager" containerID="cri-o://ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c" gracePeriod=30
Feb 18 14:02:40 crc kubenswrapper[4817]: I0218 14:02:40.990940 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.103688 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-config\") pod \"64967ad0-c726-4605-85fd-7c44a64e8f4c\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") "
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.103739 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64967ad0-c726-4605-85fd-7c44a64e8f4c-serving-cert\") pod \"64967ad0-c726-4605-85fd-7c44a64e8f4c\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") "
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.103833 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-client-ca\") pod \"64967ad0-c726-4605-85fd-7c44a64e8f4c\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") "
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.103869 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns6g7\" (UniqueName: \"kubernetes.io/projected/64967ad0-c726-4605-85fd-7c44a64e8f4c-kube-api-access-ns6g7\") pod \"64967ad0-c726-4605-85fd-7c44a64e8f4c\" (UID: \"64967ad0-c726-4605-85fd-7c44a64e8f4c\") "
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.105358 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-client-ca" (OuterVolumeSpecName: "client-ca") pod "64967ad0-c726-4605-85fd-7c44a64e8f4c" (UID: "64967ad0-c726-4605-85fd-7c44a64e8f4c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.105520 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-config" (OuterVolumeSpecName: "config") pod "64967ad0-c726-4605-85fd-7c44a64e8f4c" (UID: "64967ad0-c726-4605-85fd-7c44a64e8f4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.112835 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64967ad0-c726-4605-85fd-7c44a64e8f4c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "64967ad0-c726-4605-85fd-7c44a64e8f4c" (UID: "64967ad0-c726-4605-85fd-7c44a64e8f4c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.113210 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64967ad0-c726-4605-85fd-7c44a64e8f4c-kube-api-access-ns6g7" (OuterVolumeSpecName: "kube-api-access-ns6g7") pod "64967ad0-c726-4605-85fd-7c44a64e8f4c" (UID: "64967ad0-c726-4605-85fd-7c44a64e8f4c"). InnerVolumeSpecName "kube-api-access-ns6g7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.205012 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.205055 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns6g7\" (UniqueName: \"kubernetes.io/projected/64967ad0-c726-4605-85fd-7c44a64e8f4c-kube-api-access-ns6g7\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.205070 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64967ad0-c726-4605-85fd-7c44a64e8f4c-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.205080 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64967ad0-c726-4605-85fd-7c44a64e8f4c-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.350133 4817 generic.go:334] "Generic (PLEG): container finished" podID="64967ad0-c726-4605-85fd-7c44a64e8f4c" containerID="ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c" exitCode=0
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.350404 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7" event={"ID":"64967ad0-c726-4605-85fd-7c44a64e8f4c","Type":"ContainerDied","Data":"ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c"}
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.350747 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7" event={"ID":"64967ad0-c726-4605-85fd-7c44a64e8f4c","Type":"ContainerDied","Data":"baddaabfe37e0db58db2cd00e03b33f7f9d81a71a909cd5d34ac0a97929908c2"}
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.350550 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.350839 4817 scope.go:117] "RemoveContainer" containerID="ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.378469 4817 scope.go:117] "RemoveContainer" containerID="ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.386047 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"]
Feb 18 14:02:41 crc kubenswrapper[4817]: E0218 14:02:41.387885 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c\": container with ID starting with ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c not found: ID does not exist" containerID="ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.388850 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c"} err="failed to get container status \"ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c\": rpc error: code = NotFound desc = could not find container \"ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c\": container with ID starting with ba743a628dd3055d208ba3bb7ab97f378b96d8250bf79953058a4896c266c54c not found: ID does not exist"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.398148 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fc95d9d9d-7xwb7"]
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.539741 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"]
Feb 18 14:02:41 crc kubenswrapper[4817]: E0218 14:02:41.540015 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64967ad0-c726-4605-85fd-7c44a64e8f4c" containerName="route-controller-manager"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.540030 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="64967ad0-c726-4605-85fd-7c44a64e8f4c" containerName="route-controller-manager"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.540151 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="64967ad0-c726-4605-85fd-7c44a64e8f4c" containerName="route-controller-manager"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.540642 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.542832 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.543893 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.544191 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.545011 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.545593 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.545879 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.571918 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"]
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.712595 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlsvm\" (UniqueName: \"kubernetes.io/projected/0c1138f3-09ab-4422-9708-02415b5accbc-kube-api-access-wlsvm\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.712642 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c1138f3-09ab-4422-9708-02415b5accbc-config\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.712754 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c1138f3-09ab-4422-9708-02415b5accbc-client-ca\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.712783 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c1138f3-09ab-4422-9708-02415b5accbc-serving-cert\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.814503 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c1138f3-09ab-4422-9708-02415b5accbc-config\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.814689 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c1138f3-09ab-4422-9708-02415b5accbc-client-ca\") pod \"route-controller-manager-664cbf5c87-rg9z9\"
(UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.814734 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c1138f3-09ab-4422-9708-02415b5accbc-serving-cert\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.814811 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlsvm\" (UniqueName: \"kubernetes.io/projected/0c1138f3-09ab-4422-9708-02415b5accbc-kube-api-access-wlsvm\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.816009 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c1138f3-09ab-4422-9708-02415b5accbc-config\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.816146 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c1138f3-09ab-4422-9708-02415b5accbc-client-ca\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.827493 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c1138f3-09ab-4422-9708-02415b5accbc-serving-cert\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.836176 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlsvm\" (UniqueName: \"kubernetes.io/projected/0c1138f3-09ab-4422-9708-02415b5accbc-kube-api-access-wlsvm\") pod \"route-controller-manager-664cbf5c87-rg9z9\" (UID: \"0c1138f3-09ab-4422-9708-02415b5accbc\") " pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:41 crc kubenswrapper[4817]: I0218 14:02:41.867711 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:42 crc kubenswrapper[4817]: I0218 14:02:42.179474 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64967ad0-c726-4605-85fd-7c44a64e8f4c" path="/var/lib/kubelet/pods/64967ad0-c726-4605-85fd-7c44a64e8f4c/volumes"
Feb 18 14:02:42 crc kubenswrapper[4817]: I0218 14:02:42.348492 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"]
Feb 18 14:02:42 crc kubenswrapper[4817]: I0218 14:02:42.863409 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:02:42 crc kubenswrapper[4817]: I0218 14:02:42.863835 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:02:42 crc kubenswrapper[4817]: I0218 14:02:42.863892 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb"
Feb 18 14:02:42 crc kubenswrapper[4817]: I0218 14:02:42.864542 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7c1c799a80a9d975ab53d4cf5272008822680f6f55efd7a2e6bec382bbea671"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 14:02:42 crc kubenswrapper[4817]: I0218 14:02:42.864610 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://c7c1c799a80a9d975ab53d4cf5272008822680f6f55efd7a2e6bec382bbea671" gracePeriod=600
Feb 18 14:02:43 crc kubenswrapper[4817]: I0218 14:02:43.373057 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9" event={"ID":"0c1138f3-09ab-4422-9708-02415b5accbc","Type":"ContainerStarted","Data":"a432007a8310ca70be487fe14a7686bf7fd8e0468a751ef8c2eb0fe44b0f275f"}
Feb 18 14:02:43 crc kubenswrapper[4817]: I0218 14:02:43.373509 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9" event={"ID":"0c1138f3-09ab-4422-9708-02415b5accbc","Type":"ContainerStarted","Data":"892543b15efa7be60d8192b682c42a69b2a30a453a64873103a418132c80e6a3"}
Feb 18 14:02:43 crc kubenswrapper[4817]: I0218 14:02:43.375048 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:43 crc kubenswrapper[4817]: I0218 14:02:43.378135 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="c7c1c799a80a9d975ab53d4cf5272008822680f6f55efd7a2e6bec382bbea671" exitCode=0
Feb 18 14:02:43 crc kubenswrapper[4817]: I0218 14:02:43.378170 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"c7c1c799a80a9d975ab53d4cf5272008822680f6f55efd7a2e6bec382bbea671"}
Feb 18 14:02:43 crc kubenswrapper[4817]: I0218 14:02:43.378190 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"d3e9adde8434a7716ab4563cbde77006c2cd5de9992720aea0fc7ac8f5c1757e"}
Feb 18 14:02:43 crc kubenswrapper[4817]: I0218 14:02:43.383489 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9"
Feb 18 14:02:43 crc kubenswrapper[4817]: I0218 14:02:43.414048 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-664cbf5c87-rg9z9" podStartSLOduration=3.414024433 podStartE2EDuration="3.414024433s" podCreationTimestamp="2026-02-18 14:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:02:43.39448948 +0000 UTC m=+225.970025463" watchObservedRunningTime="2026-02-18 14:02:43.414024433 +0000 UTC m=+225.989560416"
Feb 18 14:02:57 crc kubenswrapper[4817]: I0218 14:02:57.304737 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness"
status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-g6m46"
Feb 18 14:02:57 crc kubenswrapper[4817]: I0218 14:02:57.388254 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mddfd"]
Feb 18 14:03:00 crc kubenswrapper[4817]: I0218 14:03:00.395697 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b45548f64-5nmnc"]
Feb 18 14:03:00 crc kubenswrapper[4817]: I0218 14:03:00.397159 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc" podUID="998d1dac-2178-4614-ba73-3032370ae8b4" containerName="controller-manager" containerID="cri-o://1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b" gracePeriod=30
Feb 18 14:03:00 crc kubenswrapper[4817]: I0218 14:03:00.941437 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.038398 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-config\") pod \"998d1dac-2178-4614-ba73-3032370ae8b4\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") "
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.038518 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-client-ca\") pod \"998d1dac-2178-4614-ba73-3032370ae8b4\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") "
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.038583 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998d1dac-2178-4614-ba73-3032370ae8b4-serving-cert\") pod \"998d1dac-2178-4614-ba73-3032370ae8b4\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") "
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.038716 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-proxy-ca-bundles\") pod \"998d1dac-2178-4614-ba73-3032370ae8b4\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") "
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.038789 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdcwl\" (UniqueName: \"kubernetes.io/projected/998d1dac-2178-4614-ba73-3032370ae8b4-kube-api-access-xdcwl\") pod \"998d1dac-2178-4614-ba73-3032370ae8b4\" (UID: \"998d1dac-2178-4614-ba73-3032370ae8b4\") "
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.039639 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-client-ca" (OuterVolumeSpecName: "client-ca") pod "998d1dac-2178-4614-ba73-3032370ae8b4" (UID: "998d1dac-2178-4614-ba73-3032370ae8b4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.039677 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-config" (OuterVolumeSpecName: "config") pod "998d1dac-2178-4614-ba73-3032370ae8b4" (UID: "998d1dac-2178-4614-ba73-3032370ae8b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.040846 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "998d1dac-2178-4614-ba73-3032370ae8b4" (UID: "998d1dac-2178-4614-ba73-3032370ae8b4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.045651 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/998d1dac-2178-4614-ba73-3032370ae8b4-kube-api-access-xdcwl" (OuterVolumeSpecName: "kube-api-access-xdcwl") pod "998d1dac-2178-4614-ba73-3032370ae8b4" (UID: "998d1dac-2178-4614-ba73-3032370ae8b4"). InnerVolumeSpecName "kube-api-access-xdcwl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.045853 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/998d1dac-2178-4614-ba73-3032370ae8b4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "998d1dac-2178-4614-ba73-3032370ae8b4" (UID: "998d1dac-2178-4614-ba73-3032370ae8b4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.139833 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.139888 4817 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-client-ca\") on node \"crc\" DevicePath \"\""
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.139907 4817 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/998d1dac-2178-4614-ba73-3032370ae8b4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.139924 4817 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/998d1dac-2178-4614-ba73-3032370ae8b4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.139945 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdcwl\" (UniqueName: \"kubernetes.io/projected/998d1dac-2178-4614-ba73-3032370ae8b4-kube-api-access-xdcwl\") on node \"crc\" DevicePath \"\""
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.516839 4817 generic.go:334] "Generic (PLEG): container finished" podID="998d1dac-2178-4614-ba73-3032370ae8b4" containerID="1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b" exitCode=0
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.516966 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc" event={"ID":"998d1dac-2178-4614-ba73-3032370ae8b4","Type":"ContainerDied","Data":"1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b"}
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.517060 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc" event={"ID":"998d1dac-2178-4614-ba73-3032370ae8b4","Type":"ContainerDied","Data":"57694da48de66ca969b73783f195907a38102d982b7d9f47777bd2a6e994641c"}
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.517106 4817 scope.go:117] "RemoveContainer" containerID="1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.517299 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b45548f64-5nmnc"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.540220 4817 scope.go:117] "RemoveContainer" containerID="1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b"
Feb 18 14:03:01 crc kubenswrapper[4817]: E0218 14:03:01.542612 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b\": container with ID starting with 1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b not found: ID does not exist" containerID="1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.542693 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b"} err="failed to get container status \"1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b\": rpc error: code = NotFound desc = could not find container \"1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b\": container with ID starting with 1806616f8386b178f492430d7a559b5208fef60916d9f03f7c364630dbe7497b not found: ID does not exist"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.566337 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55f8f55785-fgh8x"]
Feb 18 14:03:01 crc kubenswrapper[4817]: E0218 14:03:01.567331 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="998d1dac-2178-4614-ba73-3032370ae8b4" containerName="controller-manager"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.568820 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="998d1dac-2178-4614-ba73-3032370ae8b4" containerName="controller-manager"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.568997 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="998d1dac-2178-4614-ba73-3032370ae8b4" containerName="controller-manager"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.569417 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.573245 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.574025 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.574147 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.577106 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b45548f64-5nmnc"]
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.579582 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.579837 4817 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.582892 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.584140 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b45548f64-5nmnc"] Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.588373 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.589879 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55f8f55785-fgh8x"] Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.644440 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f981e0de-d009-42af-8a75-2f76754301be-proxy-ca-bundles\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.644511 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f981e0de-d009-42af-8a75-2f76754301be-client-ca\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.644660 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g79rz\" (UniqueName: 
\"kubernetes.io/projected/f981e0de-d009-42af-8a75-2f76754301be-kube-api-access-g79rz\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.644698 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f981e0de-d009-42af-8a75-2f76754301be-config\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.644832 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f981e0de-d009-42af-8a75-2f76754301be-serving-cert\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.745603 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f981e0de-d009-42af-8a75-2f76754301be-proxy-ca-bundles\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.745684 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f981e0de-d009-42af-8a75-2f76754301be-client-ca\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 
14:03:01.745758 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g79rz\" (UniqueName: \"kubernetes.io/projected/f981e0de-d009-42af-8a75-2f76754301be-kube-api-access-g79rz\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.745785 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f981e0de-d009-42af-8a75-2f76754301be-config\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.745815 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f981e0de-d009-42af-8a75-2f76754301be-serving-cert\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.748381 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f981e0de-d009-42af-8a75-2f76754301be-client-ca\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.751219 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f981e0de-d009-42af-8a75-2f76754301be-proxy-ca-bundles\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " 
pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.751488 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f981e0de-d009-42af-8a75-2f76754301be-config\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.754477 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f981e0de-d009-42af-8a75-2f76754301be-serving-cert\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.780344 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g79rz\" (UniqueName: \"kubernetes.io/projected/f981e0de-d009-42af-8a75-2f76754301be-kube-api-access-g79rz\") pod \"controller-manager-55f8f55785-fgh8x\" (UID: \"f981e0de-d009-42af-8a75-2f76754301be\") " pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:01 crc kubenswrapper[4817]: I0218 14:03:01.901346 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:02 crc kubenswrapper[4817]: I0218 14:03:02.178758 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="998d1dac-2178-4614-ba73-3032370ae8b4" path="/var/lib/kubelet/pods/998d1dac-2178-4614-ba73-3032370ae8b4/volumes" Feb 18 14:03:02 crc kubenswrapper[4817]: I0218 14:03:02.811521 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55f8f55785-fgh8x"] Feb 18 14:03:03 crc kubenswrapper[4817]: I0218 14:03:03.530760 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" event={"ID":"f981e0de-d009-42af-8a75-2f76754301be","Type":"ContainerStarted","Data":"c97f9c16cd56a13a03b05605b96da758acfd2e5bc20a4c91930725ddb306822b"} Feb 18 14:03:03 crc kubenswrapper[4817]: I0218 14:03:03.530798 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" event={"ID":"f981e0de-d009-42af-8a75-2f76754301be","Type":"ContainerStarted","Data":"5ea045d8d5d0d2c56fa76bf488c90d0072d69e385363ff29ea29f1c5575543a0"} Feb 18 14:03:03 crc kubenswrapper[4817]: I0218 14:03:03.531310 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:03 crc kubenswrapper[4817]: I0218 14:03:03.538471 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" Feb 18 14:03:03 crc kubenswrapper[4817]: I0218 14:03:03.582457 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55f8f55785-fgh8x" podStartSLOduration=3.582436404 podStartE2EDuration="3.582436404s" podCreationTimestamp="2026-02-18 14:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:03:03.555823086 +0000 UTC m=+246.131359069" watchObservedRunningTime="2026-02-18 14:03:03.582436404 +0000 UTC m=+246.157972387" Feb 18 14:03:22 crc kubenswrapper[4817]: I0218 14:03:22.446136 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" podUID="4faf1743-e825-477d-b191-830513a39317" containerName="registry" containerID="cri-o://5b48d2dceb1c7cf78a9dd57d6b9a2d6bd5ee08ab58e00b48402ac4ecacd2c1ce" gracePeriod=30 Feb 18 14:03:22 crc kubenswrapper[4817]: I0218 14:03:22.664697 4817 generic.go:334] "Generic (PLEG): container finished" podID="4faf1743-e825-477d-b191-830513a39317" containerID="5b48d2dceb1c7cf78a9dd57d6b9a2d6bd5ee08ab58e00b48402ac4ecacd2c1ce" exitCode=0 Feb 18 14:03:22 crc kubenswrapper[4817]: I0218 14:03:22.664783 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" event={"ID":"4faf1743-e825-477d-b191-830513a39317","Type":"ContainerDied","Data":"5b48d2dceb1c7cf78a9dd57d6b9a2d6bd5ee08ab58e00b48402ac4ecacd2c1ce"} Feb 18 14:03:22 crc kubenswrapper[4817]: I0218 14:03:22.962547 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.097793 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4faf1743-e825-477d-b191-830513a39317\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.097972 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-trusted-ca\") pod \"4faf1743-e825-477d-b191-830513a39317\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.098137 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-bound-sa-token\") pod \"4faf1743-e825-477d-b191-830513a39317\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.098236 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-registry-certificates\") pod \"4faf1743-e825-477d-b191-830513a39317\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.098336 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-registry-tls\") pod \"4faf1743-e825-477d-b191-830513a39317\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.098386 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4faf1743-e825-477d-b191-830513a39317-installation-pull-secrets\") pod \"4faf1743-e825-477d-b191-830513a39317\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.098464 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9ztb\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-kube-api-access-j9ztb\") pod \"4faf1743-e825-477d-b191-830513a39317\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.098512 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4faf1743-e825-477d-b191-830513a39317-ca-trust-extracted\") pod \"4faf1743-e825-477d-b191-830513a39317\" (UID: \"4faf1743-e825-477d-b191-830513a39317\") " Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.100681 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4faf1743-e825-477d-b191-830513a39317" (UID: "4faf1743-e825-477d-b191-830513a39317"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.101061 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4faf1743-e825-477d-b191-830513a39317" (UID: "4faf1743-e825-477d-b191-830513a39317"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.107593 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-kube-api-access-j9ztb" (OuterVolumeSpecName: "kube-api-access-j9ztb") pod "4faf1743-e825-477d-b191-830513a39317" (UID: "4faf1743-e825-477d-b191-830513a39317"). InnerVolumeSpecName "kube-api-access-j9ztb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.108062 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4faf1743-e825-477d-b191-830513a39317" (UID: "4faf1743-e825-477d-b191-830513a39317"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.108235 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faf1743-e825-477d-b191-830513a39317-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4faf1743-e825-477d-b191-830513a39317" (UID: "4faf1743-e825-477d-b191-830513a39317"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.109484 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4faf1743-e825-477d-b191-830513a39317" (UID: "4faf1743-e825-477d-b191-830513a39317"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.113829 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4faf1743-e825-477d-b191-830513a39317" (UID: "4faf1743-e825-477d-b191-830513a39317"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.136230 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4faf1743-e825-477d-b191-830513a39317-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4faf1743-e825-477d-b191-830513a39317" (UID: "4faf1743-e825-477d-b191-830513a39317"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.200166 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.200234 4817 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.200262 4817 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4faf1743-e825-477d-b191-830513a39317-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.200283 4817 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.200303 4817 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4faf1743-e825-477d-b191-830513a39317-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.200321 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9ztb\" (UniqueName: \"kubernetes.io/projected/4faf1743-e825-477d-b191-830513a39317-kube-api-access-j9ztb\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.200340 4817 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4faf1743-e825-477d-b191-830513a39317-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.678321 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" event={"ID":"4faf1743-e825-477d-b191-830513a39317","Type":"ContainerDied","Data":"f1ee523467d8f30e34fede55c323c6a464aa4503afd82a661b8535db3b1b1a3f"} Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.678373 4817 scope.go:117] "RemoveContainer" containerID="5b48d2dceb1c7cf78a9dd57d6b9a2d6bd5ee08ab58e00b48402ac4ecacd2c1ce" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.678416 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mddfd" Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.716476 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mddfd"] Feb 18 14:03:23 crc kubenswrapper[4817]: I0218 14:03:23.721118 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mddfd"] Feb 18 14:03:24 crc kubenswrapper[4817]: I0218 14:03:24.185869 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4faf1743-e825-477d-b191-830513a39317" path="/var/lib/kubelet/pods/4faf1743-e825-477d-b191-830513a39317/volumes" Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.899641 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hxln5"] Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.900355 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hxln5" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerName="registry-server" containerID="cri-o://7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b" gracePeriod=30 Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.903733 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54ftv"] Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.904350 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-54ftv" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" containerName="registry-server" containerID="cri-o://c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5" gracePeriod=30 Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.925098 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j4llw"] Feb 18 14:03:38 crc 
kubenswrapper[4817]: I0218 14:03:38.925396 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" podUID="18a3347a-5dd7-4047-8c43-9c073c9321e6" containerName="marketplace-operator" containerID="cri-o://b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf" gracePeriod=30 Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.941174 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6w5"] Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.941251 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b7925"] Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.941539 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b7925" podUID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerName="registry-server" containerID="cri-o://827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d" gracePeriod=30 Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.941813 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jp6w5" podUID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerName="registry-server" containerID="cri-o://52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec" gracePeriod=30 Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.941955 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cw95t"] Feb 18 14:03:38 crc kubenswrapper[4817]: E0218 14:03:38.942280 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4faf1743-e825-477d-b191-830513a39317" containerName="registry" Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.942305 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="4faf1743-e825-477d-b191-830513a39317" 
containerName="registry" Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.942461 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="4faf1743-e825-477d-b191-830513a39317" containerName="registry" Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.942943 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:38 crc kubenswrapper[4817]: I0218 14:03:38.955554 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cw95t"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.039408 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/039201b1-3f23-4f22-80cb-17f07e1732df-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cw95t\" (UID: \"039201b1-3f23-4f22-80cb-17f07e1732df\") " pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.039706 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cdt2\" (UniqueName: \"kubernetes.io/projected/039201b1-3f23-4f22-80cb-17f07e1732df-kube-api-access-8cdt2\") pod \"marketplace-operator-79b997595-cw95t\" (UID: \"039201b1-3f23-4f22-80cb-17f07e1732df\") " pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.039825 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/039201b1-3f23-4f22-80cb-17f07e1732df-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cw95t\" (UID: \"039201b1-3f23-4f22-80cb-17f07e1732df\") " pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:39 crc 
kubenswrapper[4817]: I0218 14:03:39.140930 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/039201b1-3f23-4f22-80cb-17f07e1732df-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cw95t\" (UID: \"039201b1-3f23-4f22-80cb-17f07e1732df\") " pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.140998 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cdt2\" (UniqueName: \"kubernetes.io/projected/039201b1-3f23-4f22-80cb-17f07e1732df-kube-api-access-8cdt2\") pod \"marketplace-operator-79b997595-cw95t\" (UID: \"039201b1-3f23-4f22-80cb-17f07e1732df\") " pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.141040 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/039201b1-3f23-4f22-80cb-17f07e1732df-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cw95t\" (UID: \"039201b1-3f23-4f22-80cb-17f07e1732df\") " pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.142153 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/039201b1-3f23-4f22-80cb-17f07e1732df-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cw95t\" (UID: \"039201b1-3f23-4f22-80cb-17f07e1732df\") " pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.148434 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/039201b1-3f23-4f22-80cb-17f07e1732df-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-cw95t\" (UID: \"039201b1-3f23-4f22-80cb-17f07e1732df\") " pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.162852 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cdt2\" (UniqueName: \"kubernetes.io/projected/039201b1-3f23-4f22-80cb-17f07e1732df-kube-api-access-8cdt2\") pod \"marketplace-operator-79b997595-cw95t\" (UID: \"039201b1-3f23-4f22-80cb-17f07e1732df\") " pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.270330 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.394710 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54ftv" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.448366 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc7nx\" (UniqueName: \"kubernetes.io/projected/307f9900-9137-46bb-9b32-254ae14c8c17-kube-api-access-cc7nx\") pod \"307f9900-9137-46bb-9b32-254ae14c8c17\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.448423 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-utilities\") pod \"307f9900-9137-46bb-9b32-254ae14c8c17\" (UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.448488 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-catalog-content\") pod \"307f9900-9137-46bb-9b32-254ae14c8c17\" 
(UID: \"307f9900-9137-46bb-9b32-254ae14c8c17\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.452822 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-utilities" (OuterVolumeSpecName: "utilities") pod "307f9900-9137-46bb-9b32-254ae14c8c17" (UID: "307f9900-9137-46bb-9b32-254ae14c8c17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.458817 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307f9900-9137-46bb-9b32-254ae14c8c17-kube-api-access-cc7nx" (OuterVolumeSpecName: "kube-api-access-cc7nx") pod "307f9900-9137-46bb-9b32-254ae14c8c17" (UID: "307f9900-9137-46bb-9b32-254ae14c8c17"). InnerVolumeSpecName "kube-api-access-cc7nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.517291 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hxln5" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.521044 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "307f9900-9137-46bb-9b32-254ae14c8c17" (UID: "307f9900-9137-46bb-9b32-254ae14c8c17"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.551068 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr8hq\" (UniqueName: \"kubernetes.io/projected/8162b014-86d1-482a-8c7c-eba34fed3f62-kube-api-access-wr8hq\") pod \"8162b014-86d1-482a-8c7c-eba34fed3f62\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.551118 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-utilities\") pod \"8162b014-86d1-482a-8c7c-eba34fed3f62\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.551218 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-catalog-content\") pod \"8162b014-86d1-482a-8c7c-eba34fed3f62\" (UID: \"8162b014-86d1-482a-8c7c-eba34fed3f62\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.551401 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.551412 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc7nx\" (UniqueName: \"kubernetes.io/projected/307f9900-9137-46bb-9b32-254ae14c8c17-kube-api-access-cc7nx\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.551422 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307f9900-9137-46bb-9b32-254ae14c8c17-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.561743 
4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8162b014-86d1-482a-8c7c-eba34fed3f62-kube-api-access-wr8hq" (OuterVolumeSpecName: "kube-api-access-wr8hq") pod "8162b014-86d1-482a-8c7c-eba34fed3f62" (UID: "8162b014-86d1-482a-8c7c-eba34fed3f62"). InnerVolumeSpecName "kube-api-access-wr8hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.565832 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.569464 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7925" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.572743 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-utilities" (OuterVolumeSpecName: "utilities") pod "8162b014-86d1-482a-8c7c-eba34fed3f62" (UID: "8162b014-86d1-482a-8c7c-eba34fed3f62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.610741 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.636122 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8162b014-86d1-482a-8c7c-eba34fed3f62" (UID: "8162b014-86d1-482a-8c7c-eba34fed3f62"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.651727 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d96mv\" (UniqueName: \"kubernetes.io/projected/18a3347a-5dd7-4047-8c43-9c073c9321e6-kube-api-access-d96mv\") pod \"18a3347a-5dd7-4047-8c43-9c073c9321e6\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.651781 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m47tw\" (UniqueName: \"kubernetes.io/projected/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-kube-api-access-m47tw\") pod \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.651804 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tms49\" (UniqueName: \"kubernetes.io/projected/44a62058-ed9b-4364-97d0-09af2bb1c22d-kube-api-access-tms49\") pod \"44a62058-ed9b-4364-97d0-09af2bb1c22d\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.651828 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-operator-metrics\") pod \"18a3347a-5dd7-4047-8c43-9c073c9321e6\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.651863 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-catalog-content\") pod \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.651883 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-catalog-content\") pod \"44a62058-ed9b-4364-97d0-09af2bb1c22d\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.651899 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-utilities\") pod \"44a62058-ed9b-4364-97d0-09af2bb1c22d\" (UID: \"44a62058-ed9b-4364-97d0-09af2bb1c22d\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.651916 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-utilities\") pod \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\" (UID: \"a860a4b3-b2dd-4d2f-8d2f-a959007a6197\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.651949 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-trusted-ca\") pod \"18a3347a-5dd7-4047-8c43-9c073c9321e6\" (UID: \"18a3347a-5dd7-4047-8c43-9c073c9321e6\") " Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.652078 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.652090 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8162b014-86d1-482a-8c7c-eba34fed3f62-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.652100 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr8hq\" 
(UniqueName: \"kubernetes.io/projected/8162b014-86d1-482a-8c7c-eba34fed3f62-kube-api-access-wr8hq\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.652611 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "18a3347a-5dd7-4047-8c43-9c073c9321e6" (UID: "18a3347a-5dd7-4047-8c43-9c073c9321e6"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.655501 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a3347a-5dd7-4047-8c43-9c073c9321e6-kube-api-access-d96mv" (OuterVolumeSpecName: "kube-api-access-d96mv") pod "18a3347a-5dd7-4047-8c43-9c073c9321e6" (UID: "18a3347a-5dd7-4047-8c43-9c073c9321e6"). InnerVolumeSpecName "kube-api-access-d96mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.656446 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-utilities" (OuterVolumeSpecName: "utilities") pod "44a62058-ed9b-4364-97d0-09af2bb1c22d" (UID: "44a62058-ed9b-4364-97d0-09af2bb1c22d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.657423 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-utilities" (OuterVolumeSpecName: "utilities") pod "a860a4b3-b2dd-4d2f-8d2f-a959007a6197" (UID: "a860a4b3-b2dd-4d2f-8d2f-a959007a6197"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.657558 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a62058-ed9b-4364-97d0-09af2bb1c22d-kube-api-access-tms49" (OuterVolumeSpecName: "kube-api-access-tms49") pod "44a62058-ed9b-4364-97d0-09af2bb1c22d" (UID: "44a62058-ed9b-4364-97d0-09af2bb1c22d"). InnerVolumeSpecName "kube-api-access-tms49". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.657801 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-kube-api-access-m47tw" (OuterVolumeSpecName: "kube-api-access-m47tw") pod "a860a4b3-b2dd-4d2f-8d2f-a959007a6197" (UID: "a860a4b3-b2dd-4d2f-8d2f-a959007a6197"). InnerVolumeSpecName "kube-api-access-m47tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.660854 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "18a3347a-5dd7-4047-8c43-9c073c9321e6" (UID: "18a3347a-5dd7-4047-8c43-9c073c9321e6"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.684941 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a860a4b3-b2dd-4d2f-8d2f-a959007a6197" (UID: "a860a4b3-b2dd-4d2f-8d2f-a959007a6197"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.752453 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m47tw\" (UniqueName: \"kubernetes.io/projected/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-kube-api-access-m47tw\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.752486 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tms49\" (UniqueName: \"kubernetes.io/projected/44a62058-ed9b-4364-97d0-09af2bb1c22d-kube-api-access-tms49\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.752496 4817 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.752507 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.752515 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.752523 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a860a4b3-b2dd-4d2f-8d2f-a959007a6197-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.752531 4817 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/18a3347a-5dd7-4047-8c43-9c073c9321e6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 
crc kubenswrapper[4817]: I0218 14:03:39.752539 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d96mv\" (UniqueName: \"kubernetes.io/projected/18a3347a-5dd7-4047-8c43-9c073c9321e6-kube-api-access-d96mv\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.792548 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44a62058-ed9b-4364-97d0-09af2bb1c22d" (UID: "44a62058-ed9b-4364-97d0-09af2bb1c22d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.794053 4817 generic.go:334] "Generic (PLEG): container finished" podID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerID="7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b" exitCode=0 Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.794133 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hxln5" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.794115 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxln5" event={"ID":"8162b014-86d1-482a-8c7c-eba34fed3f62","Type":"ContainerDied","Data":"7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b"} Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.795018 4817 scope.go:117] "RemoveContainer" containerID="7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.794973 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxln5" event={"ID":"8162b014-86d1-482a-8c7c-eba34fed3f62","Type":"ContainerDied","Data":"6648c733b10888135df31bb74dcee822a035d8432afeefa97dd64240c9bdd812"} Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.797835 4817 generic.go:334] "Generic (PLEG): container finished" podID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerID="827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d" exitCode=0 Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.797998 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7925" event={"ID":"44a62058-ed9b-4364-97d0-09af2bb1c22d","Type":"ContainerDied","Data":"827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d"} Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.798058 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7925" event={"ID":"44a62058-ed9b-4364-97d0-09af2bb1c22d","Type":"ContainerDied","Data":"409b121b87837cca791c4f60e0e46e6192da6970ca02ef9ba5489e96f1250b8f"} Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.798029 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b7925" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.801135 4817 generic.go:334] "Generic (PLEG): container finished" podID="18a3347a-5dd7-4047-8c43-9c073c9321e6" containerID="b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf" exitCode=0 Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.801295 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" event={"ID":"18a3347a-5dd7-4047-8c43-9c073c9321e6","Type":"ContainerDied","Data":"b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf"} Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.801341 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" event={"ID":"18a3347a-5dd7-4047-8c43-9c073c9321e6","Type":"ContainerDied","Data":"0ab67841e6ac80e55979c94e709143b08600c0a3f7865376e902b3ba086a052e"} Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.801418 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j4llw" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.805853 4817 generic.go:334] "Generic (PLEG): container finished" podID="307f9900-9137-46bb-9b32-254ae14c8c17" containerID="c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5" exitCode=0 Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.805950 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54ftv" event={"ID":"307f9900-9137-46bb-9b32-254ae14c8c17","Type":"ContainerDied","Data":"c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5"} Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.806015 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54ftv" event={"ID":"307f9900-9137-46bb-9b32-254ae14c8c17","Type":"ContainerDied","Data":"c866af86003101a5b03304faf9f9c8eba07e17488bbf0d2e6a9406274bbcad67"} Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.806112 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-54ftv" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.811641 4817 generic.go:334] "Generic (PLEG): container finished" podID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerID="52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec" exitCode=0 Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.811726 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6w5" event={"ID":"a860a4b3-b2dd-4d2f-8d2f-a959007a6197","Type":"ContainerDied","Data":"52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec"} Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.811821 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jp6w5" event={"ID":"a860a4b3-b2dd-4d2f-8d2f-a959007a6197","Type":"ContainerDied","Data":"62fb97cbfacc50e216527029bc24d68da25c026473c5ec48450bc55b0896a11e"} Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.812133 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jp6w5" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.823753 4817 scope.go:117] "RemoveContainer" containerID="1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.845755 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hxln5"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.852721 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hxln5"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.853174 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a62058-ed9b-4364-97d0-09af2bb1c22d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.856097 4817 scope.go:117] "RemoveContainer" containerID="94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.860670 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j4llw"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.873854 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j4llw"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.889693 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6w5"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.897065 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jp6w5"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.899397 4817 scope.go:117] "RemoveContainer" containerID="7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b" Feb 18 14:03:39 crc kubenswrapper[4817]: E0218 
14:03:39.900512 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b\": container with ID starting with 7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b not found: ID does not exist" containerID="7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.900585 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b"} err="failed to get container status \"7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b\": rpc error: code = NotFound desc = could not find container \"7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b\": container with ID starting with 7775c4178f5b2931c0ed71c9f9adbc601246b52aff904af7bd43e960221a1d9b not found: ID does not exist" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.900628 4817 scope.go:117] "RemoveContainer" containerID="1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c" Feb 18 14:03:39 crc kubenswrapper[4817]: E0218 14:03:39.901397 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c\": container with ID starting with 1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c not found: ID does not exist" containerID="1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.901445 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c"} err="failed to get container status \"1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c\": rpc 
error: code = NotFound desc = could not find container \"1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c\": container with ID starting with 1237dc5b449277e51dd56196d51e311f3272f33a7d5d2085f79ca8f81f4cc16c not found: ID does not exist" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.901480 4817 scope.go:117] "RemoveContainer" containerID="94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385" Feb 18 14:03:39 crc kubenswrapper[4817]: E0218 14:03:39.904374 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385\": container with ID starting with 94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385 not found: ID does not exist" containerID="94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.904445 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385"} err="failed to get container status \"94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385\": rpc error: code = NotFound desc = could not find container \"94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385\": container with ID starting with 94fd463f320e734957f758e9aa5704c0873f4941b3317af56b7a7914e01b2385 not found: ID does not exist" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.904475 4817 scope.go:117] "RemoveContainer" containerID="827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.904585 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cw95t"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.909049 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-54ftv"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.914107 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-54ftv"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.920380 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b7925"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.923837 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b7925"] Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.931131 4817 scope.go:117] "RemoveContainer" containerID="b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.958128 4817 scope.go:117] "RemoveContainer" containerID="89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.974550 4817 scope.go:117] "RemoveContainer" containerID="827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d" Feb 18 14:03:39 crc kubenswrapper[4817]: E0218 14:03:39.975122 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d\": container with ID starting with 827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d not found: ID does not exist" containerID="827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d" Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.975174 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d"} err="failed to get container status \"827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d\": rpc error: code = NotFound desc = could not find container 
\"827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d\": container with ID starting with 827bcc31902382441f5dc5292cbc88bab9c20d1b18edec4252963d92d90ca87d not found: ID does not exist"
Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.975204 4817 scope.go:117] "RemoveContainer" containerID="b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02"
Feb 18 14:03:39 crc kubenswrapper[4817]: E0218 14:03:39.975514 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02\": container with ID starting with b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02 not found: ID does not exist" containerID="b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02"
Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.975550 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02"} err="failed to get container status \"b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02\": rpc error: code = NotFound desc = could not find container \"b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02\": container with ID starting with b6d03a5f99c01f8b29d89d9b1cc1fa0abe0ae20e5a3c80d1e983f151b0f73e02 not found: ID does not exist"
Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.975572 4817 scope.go:117] "RemoveContainer" containerID="89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2"
Feb 18 14:03:39 crc kubenswrapper[4817]: E0218 14:03:39.975776 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2\": container with ID starting with 89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2 not found: ID does not exist" containerID="89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2"
Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.975812 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2"} err="failed to get container status \"89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2\": rpc error: code = NotFound desc = could not find container \"89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2\": container with ID starting with 89594fa67868a4f25b500350d3891d510e66849a0e5aaf872bcd923442db2fe2 not found: ID does not exist"
Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.975832 4817 scope.go:117] "RemoveContainer" containerID="b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf"
Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.990842 4817 scope.go:117] "RemoveContainer" containerID="b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf"
Feb 18 14:03:39 crc kubenswrapper[4817]: E0218 14:03:39.991350 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf\": container with ID starting with b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf not found: ID does not exist" containerID="b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf"
Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.991389 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf"} err="failed to get container status \"b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf\": rpc error: code = NotFound desc = could not find container \"b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf\": container with ID starting with b330da5cfdbd2ae9feb6cc659fbeba65b247ef572f1156c86b73d62b6a02dacf not found: ID does not exist"
Feb 18 14:03:39 crc kubenswrapper[4817]: I0218 14:03:39.991416 4817 scope.go:117] "RemoveContainer" containerID="c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.006927 4817 scope.go:117] "RemoveContainer" containerID="d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.022971 4817 scope.go:117] "RemoveContainer" containerID="22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.036179 4817 scope.go:117] "RemoveContainer" containerID="c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5"
Feb 18 14:03:40 crc kubenswrapper[4817]: E0218 14:03:40.036713 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5\": container with ID starting with c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5 not found: ID does not exist" containerID="c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.036746 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5"} err="failed to get container status \"c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5\": rpc error: code = NotFound desc = could not find container \"c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5\": container with ID starting with c26cc50afbd01264ca9d349db92397eee85590040151ab1b7faed36a648304e5 not found: ID does not exist"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.036785 4817 scope.go:117] "RemoveContainer" containerID="d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23"
Feb 18 14:03:40 crc kubenswrapper[4817]: E0218 14:03:40.037134 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23\": container with ID starting with d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23 not found: ID does not exist" containerID="d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.037189 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23"} err="failed to get container status \"d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23\": rpc error: code = NotFound desc = could not find container \"d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23\": container with ID starting with d19b260d2c3e89c304837eeb0754a6305edfadd5407dda9fd1a76556c9870a23 not found: ID does not exist"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.037226 4817 scope.go:117] "RemoveContainer" containerID="22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1"
Feb 18 14:03:40 crc kubenswrapper[4817]: E0218 14:03:40.037556 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1\": container with ID starting with 22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1 not found: ID does not exist" containerID="22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.037587 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1"} err="failed to get container status \"22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1\": rpc error: code = NotFound desc = could not find container \"22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1\": container with ID starting with 22298acbf82a978e9b06615fff23103085837415a73e22b240c8153c62d80df1 not found: ID does not exist"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.037604 4817 scope.go:117] "RemoveContainer" containerID="52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.050056 4817 scope.go:117] "RemoveContainer" containerID="a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.120120 4817 scope.go:117] "RemoveContainer" containerID="c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.133750 4817 scope.go:117] "RemoveContainer" containerID="52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec"
Feb 18 14:03:40 crc kubenswrapper[4817]: E0218 14:03:40.134227 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec\": container with ID starting with 52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec not found: ID does not exist" containerID="52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.134264 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec"} err="failed to get container status \"52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec\": rpc error: code = NotFound desc = could not find container \"52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec\": container with ID starting with 52302c5635b0814b8f9c823564514b85b7eb1dc5e0dee685fbbb31135ee65bec not found: ID does not exist"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.134285 4817 scope.go:117] "RemoveContainer" containerID="a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326"
Feb 18 14:03:40 crc kubenswrapper[4817]: E0218 14:03:40.134524 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326\": container with ID starting with a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326 not found: ID does not exist" containerID="a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.134594 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326"} err="failed to get container status \"a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326\": rpc error: code = NotFound desc = could not find container \"a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326\": container with ID starting with a4645ae34569b8c312505667114b45db0df325131011b13469e6290f5980f326 not found: ID does not exist"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.134608 4817 scope.go:117] "RemoveContainer" containerID="c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc"
Feb 18 14:03:40 crc kubenswrapper[4817]: E0218 14:03:40.134842 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc\": container with ID starting with c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc not found: ID does not exist" containerID="c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.134883 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc"} err="failed to get container status \"c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc\": rpc error: code = NotFound desc = could not find container \"c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc\": container with ID starting with c573b08532a022ea1e7bddffb3a48364e3fc3bc378a04577f928453f616541cc not found: ID does not exist"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.178089 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a3347a-5dd7-4047-8c43-9c073c9321e6" path="/var/lib/kubelet/pods/18a3347a-5dd7-4047-8c43-9c073c9321e6/volumes"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.178655 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" path="/var/lib/kubelet/pods/307f9900-9137-46bb-9b32-254ae14c8c17/volumes"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.179375 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a62058-ed9b-4364-97d0-09af2bb1c22d" path="/var/lib/kubelet/pods/44a62058-ed9b-4364-97d0-09af2bb1c22d/volumes"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.180553 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" path="/var/lib/kubelet/pods/8162b014-86d1-482a-8c7c-eba34fed3f62/volumes"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.181234 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" path="/var/lib/kubelet/pods/a860a4b3-b2dd-4d2f-8d2f-a959007a6197/volumes"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.823595 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" event={"ID":"039201b1-3f23-4f22-80cb-17f07e1732df","Type":"ContainerStarted","Data":"80433e2a1868f8019ea00e0287f75c96d537c91ea1224630d48d46204b49c38e"}
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.824099 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cw95t"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.824173 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" event={"ID":"039201b1-3f23-4f22-80cb-17f07e1732df","Type":"ContainerStarted","Data":"05ca4ba35a0c5583c8bccb7eeb33a7b36999663d712cf4a90eef73dbc489f405"}
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.834153 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cw95t"
Feb 18 14:03:40 crc kubenswrapper[4817]: I0218 14:03:40.846264 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cw95t" podStartSLOduration=2.846231789 podStartE2EDuration="2.846231789s" podCreationTimestamp="2026-02-18 14:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:03:40.842595859 +0000 UTC m=+283.418131882" watchObservedRunningTime="2026-02-18 14:03:40.846231789 +0000 UTC m=+283.421767792"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.117852 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jbhmn"]
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118080 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a3347a-5dd7-4047-8c43-9c073c9321e6" containerName="marketplace-operator"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118092 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a3347a-5dd7-4047-8c43-9c073c9321e6" containerName="marketplace-operator"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118102 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118109 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118118 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerName="extract-content"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118124 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerName="extract-content"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118134 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118140 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118147 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" containerName="extract-utilities"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118153 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" containerName="extract-utilities"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118162 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerName="extract-utilities"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118168 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerName="extract-utilities"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118177 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerName="extract-content"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118182 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerName="extract-content"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118190 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerName="extract-content"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118195 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerName="extract-content"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118203 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118208 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118215 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118220 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118231 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" containerName="extract-content"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118236 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" containerName="extract-content"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118246 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerName="extract-utilities"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118251 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerName="extract-utilities"
Feb 18 14:03:41 crc kubenswrapper[4817]: E0218 14:03:41.118259 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerName="extract-utilities"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118265 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerName="extract-utilities"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118346 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="a860a4b3-b2dd-4d2f-8d2f-a959007a6197" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118360 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a3347a-5dd7-4047-8c43-9c073c9321e6" containerName="marketplace-operator"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118370 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="8162b014-86d1-482a-8c7c-eba34fed3f62" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118378 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a62058-ed9b-4364-97d0-09af2bb1c22d" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.118385 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="307f9900-9137-46bb-9b32-254ae14c8c17" containerName="registry-server"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.119042 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.121179 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.130546 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbhmn"]
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.272039 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvsms\" (UniqueName: \"kubernetes.io/projected/32480fcf-d389-4f17-adee-4870e948038c-kube-api-access-pvsms\") pod \"redhat-marketplace-jbhmn\" (UID: \"32480fcf-d389-4f17-adee-4870e948038c\") " pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.272131 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32480fcf-d389-4f17-adee-4870e948038c-utilities\") pod \"redhat-marketplace-jbhmn\" (UID: \"32480fcf-d389-4f17-adee-4870e948038c\") " pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.272213 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32480fcf-d389-4f17-adee-4870e948038c-catalog-content\") pod \"redhat-marketplace-jbhmn\" (UID: \"32480fcf-d389-4f17-adee-4870e948038c\") " pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.328353 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rxkcv"]
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.330189 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.332438 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.334207 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rxkcv"]
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.373774 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32480fcf-d389-4f17-adee-4870e948038c-catalog-content\") pod \"redhat-marketplace-jbhmn\" (UID: \"32480fcf-d389-4f17-adee-4870e948038c\") " pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.374139 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvsms\" (UniqueName: \"kubernetes.io/projected/32480fcf-d389-4f17-adee-4870e948038c-kube-api-access-pvsms\") pod \"redhat-marketplace-jbhmn\" (UID: \"32480fcf-d389-4f17-adee-4870e948038c\") " pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.374219 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32480fcf-d389-4f17-adee-4870e948038c-utilities\") pod \"redhat-marketplace-jbhmn\" (UID: \"32480fcf-d389-4f17-adee-4870e948038c\") " pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.374485 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32480fcf-d389-4f17-adee-4870e948038c-catalog-content\") pod \"redhat-marketplace-jbhmn\" (UID: \"32480fcf-d389-4f17-adee-4870e948038c\") " pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.374652 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32480fcf-d389-4f17-adee-4870e948038c-utilities\") pod \"redhat-marketplace-jbhmn\" (UID: \"32480fcf-d389-4f17-adee-4870e948038c\") " pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.398815 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvsms\" (UniqueName: \"kubernetes.io/projected/32480fcf-d389-4f17-adee-4870e948038c-kube-api-access-pvsms\") pod \"redhat-marketplace-jbhmn\" (UID: \"32480fcf-d389-4f17-adee-4870e948038c\") " pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.435470 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbhmn"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.475600 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddv2x\" (UniqueName: \"kubernetes.io/projected/bc6b2223-1330-41d7-aad0-944699ee1f3c-kube-api-access-ddv2x\") pod \"redhat-operators-rxkcv\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.475680 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-utilities\") pod \"redhat-operators-rxkcv\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.475878 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-catalog-content\") pod \"redhat-operators-rxkcv\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.577146 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-utilities\") pod \"redhat-operators-rxkcv\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.577216 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-catalog-content\") pod \"redhat-operators-rxkcv\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.577263 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddv2x\" (UniqueName: \"kubernetes.io/projected/bc6b2223-1330-41d7-aad0-944699ee1f3c-kube-api-access-ddv2x\") pod \"redhat-operators-rxkcv\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.577701 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-utilities\") pod \"redhat-operators-rxkcv\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.577825 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-catalog-content\") pod \"redhat-operators-rxkcv\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.594641 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddv2x\" (UniqueName: \"kubernetes.io/projected/bc6b2223-1330-41d7-aad0-944699ee1f3c-kube-api-access-ddv2x\") pod \"redhat-operators-rxkcv\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.657643 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rxkcv"
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.861651 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rxkcv"]
Feb 18 14:03:41 crc kubenswrapper[4817]: W0218 14:03:41.870826 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc6b2223_1330_41d7_aad0_944699ee1f3c.slice/crio-e4dae77c358163b3026c66a6dc2b0f4c53da2bfc0abc9fb0290d936e903a79e9 WatchSource:0}: Error finding container e4dae77c358163b3026c66a6dc2b0f4c53da2bfc0abc9fb0290d936e903a79e9: Status 404 returned error can't find the container with id e4dae77c358163b3026c66a6dc2b0f4c53da2bfc0abc9fb0290d936e903a79e9
Feb 18 14:03:41 crc kubenswrapper[4817]: I0218 14:03:41.896157 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbhmn"]
Feb 18 14:03:42 crc kubenswrapper[4817]: E0218 14:03:42.140626 4817 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc6b2223_1330_41d7_aad0_944699ee1f3c.slice/crio-conmon-39d705451ea265ef9bf5db17576e31f23365fbd7341d6fa8b277d8553895b18c.scope\": RecentStats: unable to find data in memory cache]"
Feb 18 14:03:42 crc kubenswrapper[4817]: I0218 14:03:42.840477 4817 generic.go:334] "Generic (PLEG): container finished" podID="32480fcf-d389-4f17-adee-4870e948038c" containerID="a4c1537b74e4b6b33f61f27e9979f399304c1cff3518434f0fdf29141d30321e" exitCode=0
Feb 18 14:03:42 crc kubenswrapper[4817]: I0218 14:03:42.840594 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbhmn" event={"ID":"32480fcf-d389-4f17-adee-4870e948038c","Type":"ContainerDied","Data":"a4c1537b74e4b6b33f61f27e9979f399304c1cff3518434f0fdf29141d30321e"}
Feb 18 14:03:42 crc kubenswrapper[4817]: I0218 14:03:42.840777 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbhmn" event={"ID":"32480fcf-d389-4f17-adee-4870e948038c","Type":"ContainerStarted","Data":"a9d5af6008c4c29ea6019269376455406d3884e2119ea7d13fa04af7b5698588"}
Feb 18 14:03:42 crc kubenswrapper[4817]: I0218 14:03:42.842971 4817 generic.go:334] "Generic (PLEG): container finished" podID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerID="39d705451ea265ef9bf5db17576e31f23365fbd7341d6fa8b277d8553895b18c" exitCode=0
Feb 18 14:03:42 crc kubenswrapper[4817]: I0218 14:03:42.843722 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxkcv" event={"ID":"bc6b2223-1330-41d7-aad0-944699ee1f3c","Type":"ContainerDied","Data":"39d705451ea265ef9bf5db17576e31f23365fbd7341d6fa8b277d8553895b18c"}
Feb 18 14:03:42 crc kubenswrapper[4817]: I0218 14:03:42.843749 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxkcv" event={"ID":"bc6b2223-1330-41d7-aad0-944699ee1f3c","Type":"ContainerStarted","Data":"e4dae77c358163b3026c66a6dc2b0f4c53da2bfc0abc9fb0290d936e903a79e9"}
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.537317 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-95btg"]
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.538520 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.542220 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.561952 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-95btg"]
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.708508 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97057c75-124d-48f2-8931-667fa9ad766f-utilities\") pod \"certified-operators-95btg\" (UID: \"97057c75-124d-48f2-8931-667fa9ad766f\") " pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.708944 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwgs\" (UniqueName: \"kubernetes.io/projected/97057c75-124d-48f2-8931-667fa9ad766f-kube-api-access-7vwgs\") pod \"certified-operators-95btg\" (UID: \"97057c75-124d-48f2-8931-667fa9ad766f\") " pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.709155 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97057c75-124d-48f2-8931-667fa9ad766f-catalog-content\") pod \"certified-operators-95btg\" (UID: \"97057c75-124d-48f2-8931-667fa9ad766f\") " pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.720688 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b54sl"]
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.723243 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b54sl"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.726329 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.736737 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b54sl"]
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.810544 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwgs\" (UniqueName: \"kubernetes.io/projected/97057c75-124d-48f2-8931-667fa9ad766f-kube-api-access-7vwgs\") pod \"certified-operators-95btg\" (UID: \"97057c75-124d-48f2-8931-667fa9ad766f\") " pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.810630 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97057c75-124d-48f2-8931-667fa9ad766f-catalog-content\") pod \"certified-operators-95btg\" (UID: \"97057c75-124d-48f2-8931-667fa9ad766f\") " pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.810671 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97057c75-124d-48f2-8931-667fa9ad766f-utilities\") pod \"certified-operators-95btg\" (UID: \"97057c75-124d-48f2-8931-667fa9ad766f\") " pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.811147 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97057c75-124d-48f2-8931-667fa9ad766f-catalog-content\") pod \"certified-operators-95btg\" (UID: \"97057c75-124d-48f2-8931-667fa9ad766f\") " pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.811208 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97057c75-124d-48f2-8931-667fa9ad766f-utilities\") pod \"certified-operators-95btg\" (UID: \"97057c75-124d-48f2-8931-667fa9ad766f\") " pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.830636 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vwgs\" (UniqueName: \"kubernetes.io/projected/97057c75-124d-48f2-8931-667fa9ad766f-kube-api-access-7vwgs\") pod \"certified-operators-95btg\" (UID: \"97057c75-124d-48f2-8931-667fa9ad766f\") " pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.858808 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-95btg"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.867116 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbhmn" event={"ID":"32480fcf-d389-4f17-adee-4870e948038c","Type":"ContainerStarted","Data":"36a283fc57e9aaeab958de2f8b034c0ccff484ff0bc672d721d55aa27f98b8cb"}
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.911656 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-utilities\") pod \"community-operators-b54sl\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") " pod="openshift-marketplace/community-operators-b54sl"
Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.911747 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-catalog-content\") pod \"community-operators-b54sl\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") " pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:43 crc kubenswrapper[4817]: I0218 14:03:43.911775 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rmpr\" (UniqueName: \"kubernetes.io/projected/fc468458-8e7b-4993-aac2-87477b183acc-kube-api-access-2rmpr\") pod \"community-operators-b54sl\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") " pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.013180 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-utilities\") pod \"community-operators-b54sl\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") " pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.013488 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-catalog-content\") pod \"community-operators-b54sl\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") " pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.013510 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rmpr\" (UniqueName: \"kubernetes.io/projected/fc468458-8e7b-4993-aac2-87477b183acc-kube-api-access-2rmpr\") pod \"community-operators-b54sl\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") " pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.014241 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-utilities\") pod \"community-operators-b54sl\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") " pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.016662 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-catalog-content\") pod \"community-operators-b54sl\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") " pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.036733 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rmpr\" (UniqueName: \"kubernetes.io/projected/fc468458-8e7b-4993-aac2-87477b183acc-kube-api-access-2rmpr\") pod \"community-operators-b54sl\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") " pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.266105 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-95btg"] Feb 18 14:03:44 crc kubenswrapper[4817]: W0218 14:03:44.270917 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97057c75_124d_48f2_8931_667fa9ad766f.slice/crio-c778a58cfc29d75c82ce7f3a9e7ac366cdd81f4554b8157b3c251bb42fc7cc20 WatchSource:0}: Error finding container c778a58cfc29d75c82ce7f3a9e7ac366cdd81f4554b8157b3c251bb42fc7cc20: Status 404 returned error can't find the container with id c778a58cfc29d75c82ce7f3a9e7ac366cdd81f4554b8157b3c251bb42fc7cc20 Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.336693 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.541598 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b54sl"] Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.873716 4817 generic.go:334] "Generic (PLEG): container finished" podID="97057c75-124d-48f2-8931-667fa9ad766f" containerID="6e7f3b0cf0ef6d1c1d9563b3d07fa83d1e0ef2d60f116aa0cd7d1ff0ba062b50" exitCode=0 Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.873923 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95btg" event={"ID":"97057c75-124d-48f2-8931-667fa9ad766f","Type":"ContainerDied","Data":"6e7f3b0cf0ef6d1c1d9563b3d07fa83d1e0ef2d60f116aa0cd7d1ff0ba062b50"} Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.874131 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95btg" event={"ID":"97057c75-124d-48f2-8931-667fa9ad766f","Type":"ContainerStarted","Data":"c778a58cfc29d75c82ce7f3a9e7ac366cdd81f4554b8157b3c251bb42fc7cc20"} Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.876368 4817 generic.go:334] "Generic (PLEG): container finished" podID="fc468458-8e7b-4993-aac2-87477b183acc" containerID="8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d" exitCode=0 Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.876479 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b54sl" event={"ID":"fc468458-8e7b-4993-aac2-87477b183acc","Type":"ContainerDied","Data":"8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d"} Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.876519 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b54sl" 
event={"ID":"fc468458-8e7b-4993-aac2-87477b183acc","Type":"ContainerStarted","Data":"b6e20c1325c9367241c9c9dd69e530b339ccda44daa4d79a2b00b1ce0467ffd7"} Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.884013 4817 generic.go:334] "Generic (PLEG): container finished" podID="32480fcf-d389-4f17-adee-4870e948038c" containerID="36a283fc57e9aaeab958de2f8b034c0ccff484ff0bc672d721d55aa27f98b8cb" exitCode=0 Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.884138 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbhmn" event={"ID":"32480fcf-d389-4f17-adee-4870e948038c","Type":"ContainerDied","Data":"36a283fc57e9aaeab958de2f8b034c0ccff484ff0bc672d721d55aa27f98b8cb"} Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.888293 4817 generic.go:334] "Generic (PLEG): container finished" podID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerID="3382e2d1e8d128a8cf4b8c0b3ef168a0e8e8af682a1885736930424f5b0456cf" exitCode=0 Feb 18 14:03:44 crc kubenswrapper[4817]: I0218 14:03:44.888395 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxkcv" event={"ID":"bc6b2223-1330-41d7-aad0-944699ee1f3c","Type":"ContainerDied","Data":"3382e2d1e8d128a8cf4b8c0b3ef168a0e8e8af682a1885736930424f5b0456cf"} Feb 18 14:03:45 crc kubenswrapper[4817]: I0218 14:03:45.905121 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbhmn" event={"ID":"32480fcf-d389-4f17-adee-4870e948038c","Type":"ContainerStarted","Data":"7c640411286d0ba9a72153c6391da96ee4b5bbff8ae0a3609ffed54651aaddc5"} Feb 18 14:03:45 crc kubenswrapper[4817]: I0218 14:03:45.907871 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxkcv" event={"ID":"bc6b2223-1330-41d7-aad0-944699ee1f3c","Type":"ContainerStarted","Data":"542ea0e26151ce730342f18a3559c45637e430cf27d259610b4a3fe970e3a6f4"} Feb 18 14:03:45 crc kubenswrapper[4817]: I0218 
14:03:45.910013 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95btg" event={"ID":"97057c75-124d-48f2-8931-667fa9ad766f","Type":"ContainerStarted","Data":"4742f1e91c018836103c48d284fa5b44b9b29b0954c13cb0a78077e607c2f3a0"} Feb 18 14:03:45 crc kubenswrapper[4817]: I0218 14:03:45.911967 4817 generic.go:334] "Generic (PLEG): container finished" podID="fc468458-8e7b-4993-aac2-87477b183acc" containerID="c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b" exitCode=0 Feb 18 14:03:45 crc kubenswrapper[4817]: I0218 14:03:45.912040 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b54sl" event={"ID":"fc468458-8e7b-4993-aac2-87477b183acc","Type":"ContainerDied","Data":"c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b"} Feb 18 14:03:45 crc kubenswrapper[4817]: I0218 14:03:45.951589 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jbhmn" podStartSLOduration=2.498864876 podStartE2EDuration="4.951572802s" podCreationTimestamp="2026-02-18 14:03:41 +0000 UTC" firstStartedPulling="2026-02-18 14:03:42.843019443 +0000 UTC m=+285.418555456" lastFinishedPulling="2026-02-18 14:03:45.295727399 +0000 UTC m=+287.871263382" observedRunningTime="2026-02-18 14:03:45.933993877 +0000 UTC m=+288.509529870" watchObservedRunningTime="2026-02-18 14:03:45.951572802 +0000 UTC m=+288.527108785" Feb 18 14:03:45 crc kubenswrapper[4817]: I0218 14:03:45.954842 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rxkcv" podStartSLOduration=2.553890116 podStartE2EDuration="4.954834632s" podCreationTimestamp="2026-02-18 14:03:41 +0000 UTC" firstStartedPulling="2026-02-18 14:03:42.846873788 +0000 UTC m=+285.422409811" lastFinishedPulling="2026-02-18 14:03:45.247818224 +0000 UTC m=+287.823354327" observedRunningTime="2026-02-18 14:03:45.949625714 
+0000 UTC m=+288.525161697" watchObservedRunningTime="2026-02-18 14:03:45.954834632 +0000 UTC m=+288.530370605" Feb 18 14:03:46 crc kubenswrapper[4817]: I0218 14:03:46.921024 4817 generic.go:334] "Generic (PLEG): container finished" podID="97057c75-124d-48f2-8931-667fa9ad766f" containerID="4742f1e91c018836103c48d284fa5b44b9b29b0954c13cb0a78077e607c2f3a0" exitCode=0 Feb 18 14:03:46 crc kubenswrapper[4817]: I0218 14:03:46.921217 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95btg" event={"ID":"97057c75-124d-48f2-8931-667fa9ad766f","Type":"ContainerDied","Data":"4742f1e91c018836103c48d284fa5b44b9b29b0954c13cb0a78077e607c2f3a0"} Feb 18 14:03:46 crc kubenswrapper[4817]: I0218 14:03:46.924674 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b54sl" event={"ID":"fc468458-8e7b-4993-aac2-87477b183acc","Type":"ContainerStarted","Data":"49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b"} Feb 18 14:03:46 crc kubenswrapper[4817]: I0218 14:03:46.963142 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b54sl" podStartSLOduration=2.5571496419999997 podStartE2EDuration="3.963118649s" podCreationTimestamp="2026-02-18 14:03:43 +0000 UTC" firstStartedPulling="2026-02-18 14:03:44.879331095 +0000 UTC m=+287.454867078" lastFinishedPulling="2026-02-18 14:03:46.285300102 +0000 UTC m=+288.860836085" observedRunningTime="2026-02-18 14:03:46.960053114 +0000 UTC m=+289.535589097" watchObservedRunningTime="2026-02-18 14:03:46.963118649 +0000 UTC m=+289.538654632" Feb 18 14:03:47 crc kubenswrapper[4817]: I0218 14:03:47.931843 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95btg" event={"ID":"97057c75-124d-48f2-8931-667fa9ad766f","Type":"ContainerStarted","Data":"b74540939b717bcde7217319327b5c30c4ad474a2c44a16d100d506909e3bb1f"} Feb 18 14:03:51 crc 
kubenswrapper[4817]: I0218 14:03:51.436375 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jbhmn" Feb 18 14:03:51 crc kubenswrapper[4817]: I0218 14:03:51.437002 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jbhmn" Feb 18 14:03:51 crc kubenswrapper[4817]: I0218 14:03:51.488900 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jbhmn" Feb 18 14:03:51 crc kubenswrapper[4817]: I0218 14:03:51.506780 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-95btg" podStartSLOduration=5.995667569 podStartE2EDuration="8.506763607s" podCreationTimestamp="2026-02-18 14:03:43 +0000 UTC" firstStartedPulling="2026-02-18 14:03:44.876902415 +0000 UTC m=+287.452438398" lastFinishedPulling="2026-02-18 14:03:47.387998453 +0000 UTC m=+289.963534436" observedRunningTime="2026-02-18 14:03:47.949402532 +0000 UTC m=+290.524938535" watchObservedRunningTime="2026-02-18 14:03:51.506763607 +0000 UTC m=+294.082299590" Feb 18 14:03:51 crc kubenswrapper[4817]: I0218 14:03:51.658772 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rxkcv" Feb 18 14:03:51 crc kubenswrapper[4817]: I0218 14:03:51.658832 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rxkcv" Feb 18 14:03:51 crc kubenswrapper[4817]: I0218 14:03:51.697813 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rxkcv" Feb 18 14:03:52 crc kubenswrapper[4817]: I0218 14:03:52.018762 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jbhmn" Feb 18 14:03:52 crc kubenswrapper[4817]: I0218 14:03:52.019576 4817 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rxkcv" Feb 18 14:03:53 crc kubenswrapper[4817]: I0218 14:03:53.859340 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-95btg" Feb 18 14:03:53 crc kubenswrapper[4817]: I0218 14:03:53.860613 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-95btg" Feb 18 14:03:53 crc kubenswrapper[4817]: I0218 14:03:53.918355 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-95btg" Feb 18 14:03:54 crc kubenswrapper[4817]: I0218 14:03:54.014180 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-95btg" Feb 18 14:03:54 crc kubenswrapper[4817]: I0218 14:03:54.337412 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:54 crc kubenswrapper[4817]: I0218 14:03:54.337824 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:54 crc kubenswrapper[4817]: I0218 14:03:54.377944 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:55 crc kubenswrapper[4817]: I0218 14:03:55.022644 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b54sl" Feb 18 14:03:57 crc kubenswrapper[4817]: I0218 14:03:57.950316 4817 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 18 14:05:12 crc kubenswrapper[4817]: I0218 14:05:12.863635 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:05:12 crc kubenswrapper[4817]: I0218 14:05:12.864355 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:05:42 crc kubenswrapper[4817]: I0218 14:05:42.863063 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:05:42 crc kubenswrapper[4817]: I0218 14:05:42.863655 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:05:58 crc kubenswrapper[4817]: I0218 14:05:58.466594 4817 scope.go:117] "RemoveContainer" containerID="c1ca46ca13eead25086d7cfba371f7db6f3d8f0f30cc2f2dcf67c84a101948a5" Feb 18 14:05:58 crc kubenswrapper[4817]: I0218 14:05:58.500100 4817 scope.go:117] "RemoveContainer" containerID="aef25b3d5c1cf6030c480735a540395a770eecf1c36854fce256f043901d64ed" Feb 18 14:06:12 crc kubenswrapper[4817]: I0218 14:06:12.863500 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 18 14:06:12 crc kubenswrapper[4817]: I0218 14:06:12.866238 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:06:12 crc kubenswrapper[4817]: I0218 14:06:12.866365 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 14:06:12 crc kubenswrapper[4817]: I0218 14:06:12.867350 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3e9adde8434a7716ab4563cbde77006c2cd5de9992720aea0fc7ac8f5c1757e"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 14:06:12 crc kubenswrapper[4817]: I0218 14:06:12.867446 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://d3e9adde8434a7716ab4563cbde77006c2cd5de9992720aea0fc7ac8f5c1757e" gracePeriod=600 Feb 18 14:06:13 crc kubenswrapper[4817]: I0218 14:06:13.876482 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="d3e9adde8434a7716ab4563cbde77006c2cd5de9992720aea0fc7ac8f5c1757e" exitCode=0 Feb 18 14:06:13 crc kubenswrapper[4817]: I0218 14:06:13.876570 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" 
event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"d3e9adde8434a7716ab4563cbde77006c2cd5de9992720aea0fc7ac8f5c1757e"} Feb 18 14:06:13 crc kubenswrapper[4817]: I0218 14:06:13.876833 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"7dfc7dd34d408c82e87d251482328355deda32f5409047841a6c0bd478ccafc4"} Feb 18 14:06:13 crc kubenswrapper[4817]: I0218 14:06:13.876860 4817 scope.go:117] "RemoveContainer" containerID="c7c1c799a80a9d975ab53d4cf5272008822680f6f55efd7a2e6bec382bbea671" Feb 18 14:06:58 crc kubenswrapper[4817]: I0218 14:06:58.530357 4817 scope.go:117] "RemoveContainer" containerID="7fb66e54528ab6ff9d1c0c8d4b09001788707003b8efc16d7c6c01848f98f457" Feb 18 14:06:58 crc kubenswrapper[4817]: I0218 14:06:58.556862 4817 scope.go:117] "RemoveContainer" containerID="20c296afdc5f49ae6034c399882826415a69343541925a9cc42470bfa011b92a" Feb 18 14:06:58 crc kubenswrapper[4817]: I0218 14:06:58.575439 4817 scope.go:117] "RemoveContainer" containerID="2d9d2e2ff1a1966c633ab7b28ba0f193515d3c1f1e534ef12985232374250949" Feb 18 14:06:58 crc kubenswrapper[4817]: I0218 14:06:58.614048 4817 scope.go:117] "RemoveContainer" containerID="7ec2b58eecf629d97d2f22905fe6a3111b678e0ccdcf90e5126468ab53580908" Feb 18 14:08:29 crc kubenswrapper[4817]: I0218 14:08:29.910391 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk"] Feb 18 14:08:29 crc kubenswrapper[4817]: I0218 14:08:29.912044 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:29 crc kubenswrapper[4817]: I0218 14:08:29.914241 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 18 14:08:29 crc kubenswrapper[4817]: I0218 14:08:29.919409 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk"] Feb 18 14:08:29 crc kubenswrapper[4817]: I0218 14:08:29.922278 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:29 crc kubenswrapper[4817]: I0218 14:08:29.922349 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:29 crc kubenswrapper[4817]: I0218 14:08:29.922394 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdfrj\" (UniqueName: \"kubernetes.io/projected/f23865a8-7bc6-47b8-a9c8-6c9188463757-kube-api-access-fdfrj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:30 crc kubenswrapper[4817]: 
I0218 14:08:30.023506 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdfrj\" (UniqueName: \"kubernetes.io/projected/f23865a8-7bc6-47b8-a9c8-6c9188463757-kube-api-access-fdfrj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:30 crc kubenswrapper[4817]: I0218 14:08:30.023827 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:30 crc kubenswrapper[4817]: I0218 14:08:30.023956 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:30 crc kubenswrapper[4817]: I0218 14:08:30.024410 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:30 crc kubenswrapper[4817]: I0218 14:08:30.024435 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:30 crc kubenswrapper[4817]: I0218 14:08:30.049043 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdfrj\" (UniqueName: \"kubernetes.io/projected/f23865a8-7bc6-47b8-a9c8-6c9188463757-kube-api-access-fdfrj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:30 crc kubenswrapper[4817]: I0218 14:08:30.229018 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" Feb 18 14:08:30 crc kubenswrapper[4817]: I0218 14:08:30.455738 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk"] Feb 18 14:08:31 crc kubenswrapper[4817]: I0218 14:08:31.212830 4817 generic.go:334] "Generic (PLEG): container finished" podID="f23865a8-7bc6-47b8-a9c8-6c9188463757" containerID="a4738309aca78815a815ca2156e3d202a548aede94f7df8f6e12501f1548f6dd" exitCode=0 Feb 18 14:08:31 crc kubenswrapper[4817]: I0218 14:08:31.212882 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" event={"ID":"f23865a8-7bc6-47b8-a9c8-6c9188463757","Type":"ContainerDied","Data":"a4738309aca78815a815ca2156e3d202a548aede94f7df8f6e12501f1548f6dd"} Feb 18 14:08:31 crc kubenswrapper[4817]: I0218 14:08:31.212913 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" event={"ID":"f23865a8-7bc6-47b8-a9c8-6c9188463757","Type":"ContainerStarted","Data":"cb41a0774866324b1c35673856e4cebb384994b07e1f341c7309321f2628229e"} Feb 18 14:08:31 crc kubenswrapper[4817]: I0218 14:08:31.215806 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 14:08:33 crc kubenswrapper[4817]: I0218 14:08:33.230100 4817 generic.go:334] "Generic (PLEG): container finished" podID="f23865a8-7bc6-47b8-a9c8-6c9188463757" containerID="8a2527ee5f2f337f4915fe9c5f677ff81da359e643a79d78dacd283113c233fa" exitCode=0 Feb 18 14:08:33 crc kubenswrapper[4817]: I0218 14:08:33.230429 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" event={"ID":"f23865a8-7bc6-47b8-a9c8-6c9188463757","Type":"ContainerDied","Data":"8a2527ee5f2f337f4915fe9c5f677ff81da359e643a79d78dacd283113c233fa"} Feb 18 14:08:34 crc kubenswrapper[4817]: I0218 14:08:34.238492 4817 generic.go:334] "Generic (PLEG): container finished" podID="f23865a8-7bc6-47b8-a9c8-6c9188463757" containerID="b6c0ad35b55f1870780309d39e951382fd223a0551669f9e0d2135d6cc16b905" exitCode=0 Feb 18 14:08:34 crc kubenswrapper[4817]: I0218 14:08:34.238554 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" event={"ID":"f23865a8-7bc6-47b8-a9c8-6c9188463757","Type":"ContainerDied","Data":"b6c0ad35b55f1870780309d39e951382fd223a0551669f9e0d2135d6cc16b905"} Feb 18 14:08:35 crc kubenswrapper[4817]: I0218 14:08:35.610609 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk"
Feb 18 14:08:35 crc kubenswrapper[4817]: I0218 14:08:35.717799 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-bundle\") pod \"f23865a8-7bc6-47b8-a9c8-6c9188463757\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") "
Feb 18 14:08:35 crc kubenswrapper[4817]: I0218 14:08:35.717893 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdfrj\" (UniqueName: \"kubernetes.io/projected/f23865a8-7bc6-47b8-a9c8-6c9188463757-kube-api-access-fdfrj\") pod \"f23865a8-7bc6-47b8-a9c8-6c9188463757\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") "
Feb 18 14:08:35 crc kubenswrapper[4817]: I0218 14:08:35.717944 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-util\") pod \"f23865a8-7bc6-47b8-a9c8-6c9188463757\" (UID: \"f23865a8-7bc6-47b8-a9c8-6c9188463757\") "
Feb 18 14:08:35 crc kubenswrapper[4817]: I0218 14:08:35.720631 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-bundle" (OuterVolumeSpecName: "bundle") pod "f23865a8-7bc6-47b8-a9c8-6c9188463757" (UID: "f23865a8-7bc6-47b8-a9c8-6c9188463757"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:08:35 crc kubenswrapper[4817]: I0218 14:08:35.723013 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f23865a8-7bc6-47b8-a9c8-6c9188463757-kube-api-access-fdfrj" (OuterVolumeSpecName: "kube-api-access-fdfrj") pod "f23865a8-7bc6-47b8-a9c8-6c9188463757" (UID: "f23865a8-7bc6-47b8-a9c8-6c9188463757"). InnerVolumeSpecName "kube-api-access-fdfrj".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:08:35 crc kubenswrapper[4817]: I0218 14:08:35.737619 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-util" (OuterVolumeSpecName: "util") pod "f23865a8-7bc6-47b8-a9c8-6c9188463757" (UID: "f23865a8-7bc6-47b8-a9c8-6c9188463757"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:08:35 crc kubenswrapper[4817]: I0218 14:08:35.819272 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdfrj\" (UniqueName: \"kubernetes.io/projected/f23865a8-7bc6-47b8-a9c8-6c9188463757-kube-api-access-fdfrj\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:35 crc kubenswrapper[4817]: I0218 14:08:35.819326 4817 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-util\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:35 crc kubenswrapper[4817]: I0218 14:08:35.819346 4817 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f23865a8-7bc6-47b8-a9c8-6c9188463757-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:36 crc kubenswrapper[4817]: I0218 14:08:36.259737 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk" event={"ID":"f23865a8-7bc6-47b8-a9c8-6c9188463757","Type":"ContainerDied","Data":"cb41a0774866324b1c35673856e4cebb384994b07e1f341c7309321f2628229e"}
Feb 18 14:08:36 crc kubenswrapper[4817]: I0218 14:08:36.259816 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb41a0774866324b1c35673856e4cebb384994b07e1f341c7309321f2628229e"
Feb 18 14:08:36 crc kubenswrapper[4817]: I0218 14:08:36.259864 4817 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.321130 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zh96d"]
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.322239 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovn-controller" containerID="cri-o://96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f" gracePeriod=30
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.322341 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="nbdb" containerID="cri-o://0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1" gracePeriod=30
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.322396 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="sbdb" containerID="cri-o://f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1" gracePeriod=30
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.322476 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="northd" containerID="cri-o://006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63" gracePeriod=30
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.322476 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="kube-rbac-proxy-node"
containerID="cri-o://74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed" gracePeriod=30
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.322438 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovn-acl-logging" containerID="cri-o://d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a" gracePeriod=30
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.322556 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80" gracePeriod=30
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.370939 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovnkube-controller" containerID="cri-o://1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11" gracePeriod=30
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.673933 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zh96d_1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b/ovn-acl-logging/0.log"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.674400 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zh96d_1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b/ovn-controller/0.log"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.674956 4817 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702277 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-log-socket\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702336 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-config\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702354 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-etc-openvswitch\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702385 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-netns\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702407 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-env-overrides\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702429 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-ovn-kubernetes\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702451 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-openvswitch\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702480 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-netd\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") " Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702464 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702507 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702472 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702510 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-script-lib\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702521 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702554 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-log-socket" (OuterVolumeSpecName: "log-socket") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "log-socket".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702551 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702584 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-kubelet\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702609 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "host-kubelet".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702660 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-systemd-units\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702689 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-bin\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702718 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovn-node-metrics-cert\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702749 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-var-lib-openvswitch\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702758 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "systemd-units".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702792 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702793 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-node-log\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702819 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-node-log" (OuterVolumeSpecName: "node-log") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "node-log".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702832 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702866 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-slash\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702906 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npdcw\" (UniqueName: \"kubernetes.io/projected/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-kube-api-access-npdcw\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702939 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-ovn\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.702970 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-systemd\") pod \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\" (UID: \"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b\") "
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703269 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for
volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703290 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703319 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703324 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703300 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-slash" (OuterVolumeSpecName: "host-slash") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703355 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703450 4817 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-log-socket\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703463 4817 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703477 4817 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-netns\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703484 4817 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc
kubenswrapper[4817]: I0218 14:08:41.703493 4817 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703501 4817 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703508 4817 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-netd\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703516 4817 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703524 4817 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-kubelet\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703531 4817 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-systemd-units\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703539 4817 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-cni-bin\") on node \"crc\" DevicePath \"\""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703548 4817 reconciler_common.go:293] "Volume detached for volume
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703556 4817 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-node-log\") on node \"crc\" DevicePath \"\"" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703563 4817 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.703572 4817 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-host-slash\") on node \"crc\" DevicePath \"\"" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.704281 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.720008 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.721164 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-kube-api-access-npdcw" (OuterVolumeSpecName: "kube-api-access-npdcw") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "kube-api-access-npdcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.726340 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" (UID: "1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.740661 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c2xjs"]
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.740879 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="kube-rbac-proxy-ovn-metrics"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.740895 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="kube-rbac-proxy-ovn-metrics"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.740909 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="sbdb"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.740917 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="sbdb"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.740927 4817 cpu_manager.go:410]
"RemoveStaleState: removing container" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovn-controller"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.740935 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovn-controller"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.740947 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="northd"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.740956 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="northd"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.740967 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23865a8-7bc6-47b8-a9c8-6c9188463757" containerName="pull"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.740997 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23865a8-7bc6-47b8-a9c8-6c9188463757" containerName="pull"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.741016 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="nbdb"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741025 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="nbdb"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.741037 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovn-acl-logging"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741046 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovn-acl-logging"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.741062 4817 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="f23865a8-7bc6-47b8-a9c8-6c9188463757" containerName="util"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741070 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23865a8-7bc6-47b8-a9c8-6c9188463757" containerName="util"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.741081 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="kubecfg-setup"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741089 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="kubecfg-setup"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.741101 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="kube-rbac-proxy-node"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741108 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="kube-rbac-proxy-node"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.741121 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovnkube-controller"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741129 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovnkube-controller"
Feb 18 14:08:41 crc kubenswrapper[4817]: E0218 14:08:41.741141 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23865a8-7bc6-47b8-a9c8-6c9188463757" containerName="extract"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741150 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23865a8-7bc6-47b8-a9c8-6c9188463757" containerName="extract"
Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741297 4817 memory_manager.go:354] "RemoveStaleState removing state"
podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovn-controller" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741318 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="sbdb" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741335 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovn-acl-logging" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741352 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="northd" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741368 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="kube-rbac-proxy-node" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741379 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="ovnkube-controller" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741398 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f23865a8-7bc6-47b8-a9c8-6c9188463757" containerName="extract" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741432 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="nbdb" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.741442 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerName="kube-rbac-proxy-ovn-metrics" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.747358 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.804182 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-var-lib-openvswitch\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.804484 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-ovnkube-script-lib\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.804602 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df9wh\" (UniqueName: \"kubernetes.io/projected/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-kube-api-access-df9wh\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.804698 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-run-systemd\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.804799 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-run-openvswitch\") pod 
\"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.804900 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-ovn-node-metrics-cert\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.805021 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-etc-openvswitch\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.805132 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-systemd-units\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.805225 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.805353 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-node-log\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.805466 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-cni-netd\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.805585 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-ovnkube-config\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.805693 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.805801 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-run-netns\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.805908 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-cni-bin\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.806038 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-env-overrides\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.806157 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-kubelet\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.806265 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-log-socket\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.806387 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-run-ovn\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.806441 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-slash\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.806521 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npdcw\" (UniqueName: \"kubernetes.io/projected/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-kube-api-access-npdcw\") on node \"crc\" DevicePath \"\"" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.806538 4817 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.806549 4817 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.806563 4817 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.806576 4817 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907278 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-var-lib-openvswitch\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: 
I0218 14:08:41.907334 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-ovnkube-script-lib\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907358 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df9wh\" (UniqueName: \"kubernetes.io/projected/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-kube-api-access-df9wh\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907378 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-run-systemd\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907402 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-run-openvswitch\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907410 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-var-lib-openvswitch\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907421 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-ovn-node-metrics-cert\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907510 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-etc-openvswitch\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907565 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-systemd-units\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907612 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907681 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-node-log\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907731 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-cni-netd\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907786 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-ovnkube-config\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907827 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907866 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-run-netns\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907920 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-cni-bin\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.907960 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-env-overrides\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908046 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-kubelet\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908076 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-etc-openvswitch\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908095 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-log-socket\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908046 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-cni-netd\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908129 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-systemd-units\") pod 
\"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908131 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-run-ovn\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908150 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908168 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-slash\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908189 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-run-systemd\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908048 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-ovnkube-script-lib\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908256 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-kubelet\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908286 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-log-socket\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908321 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-run-openvswitch\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908350 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908383 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-cni-bin\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908077 
4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-run-netns\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908173 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-node-log\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908431 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-ovnkube-config\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908501 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-host-slash\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908513 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-run-ovn\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.908924 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-env-overrides\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.912917 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-ovn-node-metrics-cert\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:41 crc kubenswrapper[4817]: I0218 14:08:41.934493 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df9wh\" (UniqueName: \"kubernetes.io/projected/ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5-kube-api-access-df9wh\") pod \"ovnkube-node-c2xjs\" (UID: \"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.077232 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.296435 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xkbz6_04978aec-7cd4-435f-a1d9-d3e0223c0e75/kube-multus/0.log" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.296992 4817 generic.go:334] "Generic (PLEG): container finished" podID="04978aec-7cd4-435f-a1d9-d3e0223c0e75" containerID="f6781392caddb60590b6d79fa957427f81b0bbf46661c2ea8b56564e7b4bac1b" exitCode=2 Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.297080 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xkbz6" event={"ID":"04978aec-7cd4-435f-a1d9-d3e0223c0e75","Type":"ContainerDied","Data":"f6781392caddb60590b6d79fa957427f81b0bbf46661c2ea8b56564e7b4bac1b"} Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.297575 4817 scope.go:117] "RemoveContainer" containerID="f6781392caddb60590b6d79fa957427f81b0bbf46661c2ea8b56564e7b4bac1b" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.318485 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zh96d_1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b/ovn-acl-logging/0.log" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.321362 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zh96d_1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b/ovn-controller/0.log" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.321893 4817 generic.go:334] "Generic (PLEG): container finished" podID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerID="1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11" exitCode=0 Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.321926 4817 generic.go:334] "Generic (PLEG): container finished" podID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerID="f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1" exitCode=0 
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.321937 4817 generic.go:334] "Generic (PLEG): container finished" podID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerID="0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1" exitCode=0
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.321947 4817 generic.go:334] "Generic (PLEG): container finished" podID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerID="006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63" exitCode=0
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.321954 4817 generic.go:334] "Generic (PLEG): container finished" podID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerID="e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80" exitCode=0
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.321960 4817 generic.go:334] "Generic (PLEG): container finished" podID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerID="74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed" exitCode=0
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.321966 4817 generic.go:334] "Generic (PLEG): container finished" podID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerID="d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a" exitCode=143
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.321954 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerDied","Data":"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322029 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322052 4817 scope.go:117] "RemoveContainer" containerID="1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.321986 4817 generic.go:334] "Generic (PLEG): container finished" podID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" containerID="96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f" exitCode=143
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322037 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerDied","Data":"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322181 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerDied","Data":"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322199 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerDied","Data":"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322212 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerDied","Data":"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322222 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerDied","Data":"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322235 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322245 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322251 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322258 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerDied","Data":"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322265 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322273 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322278 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322283 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322288 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322293 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322298 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322302 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322308 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322315 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerDied","Data":"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322323 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322330 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322335 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322341 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322348 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322353 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322358 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322363 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322368 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322375 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zh96d" event={"ID":"1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b","Type":"ContainerDied","Data":"6959bc59ccef4b1b32e4b2c520bee0db960072623fbbc83680c6ae2e722c80d9"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322382 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322388 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322393 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322399 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322403 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322408 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322413 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322418 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.322423 4817 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.323752 4817 generic.go:334] "Generic (PLEG): container finished" podID="ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5" containerID="20d66092628cd66961b5afb0de2c97808b6e003a5c30936b38daf736c78dcf86" exitCode=0
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.323787 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" event={"ID":"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5","Type":"ContainerDied","Data":"20d66092628cd66961b5afb0de2c97808b6e003a5c30936b38daf736c78dcf86"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.323812 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" event={"ID":"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5","Type":"ContainerStarted","Data":"7341b0fb632674806ecce83a2e37a7044d6ec3846f4f041a57da41eceb4175a6"}
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.357191 4817 scope.go:117] "RemoveContainer" containerID="f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.398045 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zh96d"]
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.399994 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zh96d"]
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.405425 4817 scope.go:117] "RemoveContainer" containerID="0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.426831 4817 scope.go:117] "RemoveContainer" containerID="006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.450617 4817 scope.go:117] "RemoveContainer" containerID="e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.495181 4817 scope.go:117] "RemoveContainer" containerID="74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.515818 4817 scope.go:117] "RemoveContainer" containerID="d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.533465 4817 scope.go:117] "RemoveContainer" containerID="96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.549264 4817 scope.go:117] "RemoveContainer" containerID="87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.591749 4817 scope.go:117] "RemoveContainer" containerID="1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"
Feb 18 14:08:42 crc kubenswrapper[4817]: E0218 14:08:42.592285 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": container with ID starting with 1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11 not found: ID does not exist" containerID="1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.592341 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"} err="failed to get container status \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": rpc error: code = NotFound desc = could not find container \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": container with ID starting with 1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.592377 4817 scope.go:117] "RemoveContainer" containerID="f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"
Feb 18 14:08:42 crc kubenswrapper[4817]: E0218 14:08:42.593257 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": container with ID starting with f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1 not found: ID does not exist" containerID="f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.593285 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"} err="failed to get container status \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": rpc error: code = NotFound desc = could not find container \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": container with ID starting with f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.593310 4817 scope.go:117] "RemoveContainer" containerID="0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"
Feb 18 14:08:42 crc kubenswrapper[4817]: E0218 14:08:42.597346 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": container with ID starting with 0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1 not found: ID does not exist" containerID="0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.597373 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"} err="failed to get container status \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": rpc error: code = NotFound desc = could not find container \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": container with ID starting with 0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.597391 4817 scope.go:117] "RemoveContainer" containerID="006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"
Feb 18 14:08:42 crc kubenswrapper[4817]: E0218 14:08:42.601393 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": container with ID starting with 006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63 not found: ID does not exist" containerID="006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.601426 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"} err="failed to get container status \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": rpc error: code = NotFound desc = could not find container \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": container with ID starting with 006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.601444 4817 scope.go:117] "RemoveContainer" containerID="e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"
Feb 18 14:08:42 crc kubenswrapper[4817]: E0218 14:08:42.602177 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": container with ID starting with e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80 not found: ID does not exist" containerID="e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.602204 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"} err="failed to get container status \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": rpc error: code = NotFound desc = could not find container \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": container with ID starting with e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.602217 4817 scope.go:117] "RemoveContainer" containerID="74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"
Feb 18 14:08:42 crc kubenswrapper[4817]: E0218 14:08:42.604171 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": container with ID starting with 74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed not found: ID does not exist" containerID="74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.604194 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"} err="failed to get container status \"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": rpc error: code = NotFound desc = could not find container \"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": container with ID starting with 74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.604208 4817 scope.go:117] "RemoveContainer" containerID="d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"
Feb 18 14:08:42 crc kubenswrapper[4817]: E0218 14:08:42.605197 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a\": container with ID starting with d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a not found: ID does not exist" containerID="d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.605257 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"} err="failed to get container status \"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a\": rpc error: code = NotFound desc = could not find container \"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a\": container with ID starting with d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.605293 4817 scope.go:117] "RemoveContainer" containerID="96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"
Feb 18 14:08:42 crc kubenswrapper[4817]: E0218 14:08:42.606283 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f\": container with ID starting with 96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f not found: ID does not exist" containerID="96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.606312 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"} err="failed to get container status \"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f\": rpc error: code = NotFound desc = could not find container \"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f\": container with ID starting with 96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.606328 4817 scope.go:117] "RemoveContainer" containerID="87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"
Feb 18 14:08:42 crc kubenswrapper[4817]: E0218 14:08:42.607416 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95\": container with ID starting with 87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95 not found: ID does not exist" containerID="87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.607442 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"} err="failed to get container status \"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95\": rpc error: code = NotFound desc = could not find container \"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95\": container with ID starting with 87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.607456 4817 scope.go:117] "RemoveContainer" containerID="1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.608712 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"} err="failed to get container status \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": rpc error: code = NotFound desc = could not find container \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": container with ID starting with 1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.608731 4817 scope.go:117] "RemoveContainer" containerID="f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.609205 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"} err="failed to get container status \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": rpc error: code = NotFound desc = could not find container \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": container with ID starting with f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.609222 4817 scope.go:117] "RemoveContainer" containerID="0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.610447 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"} err="failed to get container status \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": rpc error: code = NotFound desc = could not find container \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": container with ID starting with 0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.610495 4817 scope.go:117] "RemoveContainer" containerID="006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.611215 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"} err="failed to get container status \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": rpc error: code = NotFound desc = could not find container \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": container with ID starting with 006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.611243 4817 scope.go:117] "RemoveContainer" containerID="e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.613873 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"} err="failed to get container status \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": rpc error: code = NotFound desc = could not find container \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": container with ID starting with e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.613894 4817 scope.go:117] "RemoveContainer" containerID="74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.615660 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"} err="failed to get container status \"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": rpc error: code = NotFound desc = could not find container \"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": container with ID starting with 74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.615679 4817 scope.go:117] "RemoveContainer" containerID="d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.616849 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"} err="failed to get container status \"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a\": rpc error: code = NotFound desc = could not find container \"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a\": container with ID starting with d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.616869 4817 scope.go:117] "RemoveContainer" containerID="96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.620167 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"} err="failed to get container status \"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f\": rpc error: code = NotFound desc = could not find container \"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f\": container with ID starting with 96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.620185 4817 scope.go:117] "RemoveContainer" containerID="87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.620969 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"} err="failed to get container status \"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95\": rpc error: code = NotFound desc = could not find container \"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95\": container with ID starting with 87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.621026 4817 scope.go:117] "RemoveContainer" containerID="1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.625209 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"} err="failed to get container status \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": rpc error: code = NotFound desc = could not find container \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": container with ID starting with 1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.625233 4817 scope.go:117] "RemoveContainer" containerID="f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.625679 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"} err="failed to get container status \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": rpc error: code = NotFound desc = could not find container \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": container with ID starting with f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.625697 4817 scope.go:117] "RemoveContainer" containerID="0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.625903 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"} err="failed to get container status \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": rpc error: code = NotFound desc = could not find container \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": container with ID starting with 0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.625923 4817 scope.go:117] "RemoveContainer" containerID="006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.626132 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"} err="failed to get container status \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": rpc error: code = NotFound desc = could not find container \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": container with ID starting with 006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.626153 4817 scope.go:117] "RemoveContainer" containerID="e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.626361 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"} err="failed to get container status \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": rpc error: code = NotFound desc = could not find container \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": container with ID starting with e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.626378 4817 scope.go:117] "RemoveContainer" containerID="74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.626565 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"} err="failed to get container status \"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": rpc error: code = NotFound desc = could not find container \"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": container with ID starting with 74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.626580 4817 scope.go:117] "RemoveContainer" containerID="d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.629342 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"} err="failed to get container status \"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a\": rpc error: code = NotFound desc = could not find container \"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a\": container with ID starting with d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.629364 4817 scope.go:117] "RemoveContainer" containerID="96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.631775 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"} err="failed to get container status \"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f\": rpc error: code = NotFound desc = could not find container \"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f\": container with ID starting with 96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.631797 4817 scope.go:117] "RemoveContainer" containerID="87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.633260 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"} err="failed to get container status \"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95\": rpc error: code = NotFound desc = could not find container \"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95\": container with ID starting with 87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.633354 4817 scope.go:117] "RemoveContainer" containerID="1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.633897 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"} err="failed to get container status \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": rpc error: code = NotFound desc = could not find container \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": container with ID starting with 1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11 not found: ID does not exist"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.633936 4817 scope.go:117] "RemoveContainer" containerID="f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"
Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.634318 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"} err="failed to get container status \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": rpc error: code = NotFound desc = could
not find container \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": container with ID starting with f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1 not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.634418 4817 scope.go:117] "RemoveContainer" containerID="0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.634686 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"} err="failed to get container status \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": rpc error: code = NotFound desc = could not find container \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": container with ID starting with 0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1 not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.634763 4817 scope.go:117] "RemoveContainer" containerID="006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.636461 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"} err="failed to get container status \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": rpc error: code = NotFound desc = could not find container \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": container with ID starting with 006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63 not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.636549 4817 scope.go:117] "RemoveContainer" containerID="e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 
14:08:42.636972 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"} err="failed to get container status \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": rpc error: code = NotFound desc = could not find container \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": container with ID starting with e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80 not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.637060 4817 scope.go:117] "RemoveContainer" containerID="74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.637349 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"} err="failed to get container status \"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": rpc error: code = NotFound desc = could not find container \"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": container with ID starting with 74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.637423 4817 scope.go:117] "RemoveContainer" containerID="d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.637659 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a"} err="failed to get container status \"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a\": rpc error: code = NotFound desc = could not find container \"d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a\": container with ID starting with 
d32ca79e186fd02a706c91b03da7c227b89971fa9ce92a1ff85a5554269db82a not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.637739 4817 scope.go:117] "RemoveContainer" containerID="96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.637962 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f"} err="failed to get container status \"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f\": rpc error: code = NotFound desc = could not find container \"96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f\": container with ID starting with 96802a0db761b37b3d378e229e8e23ac39f92b53a43da83d23b00f66cfb4028f not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.638055 4817 scope.go:117] "RemoveContainer" containerID="87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.638278 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95"} err="failed to get container status \"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95\": rpc error: code = NotFound desc = could not find container \"87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95\": container with ID starting with 87f3456aec91d11793ca513ea95756997ec27fc4e5f51fa639e3fb82b419de95 not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.638363 4817 scope.go:117] "RemoveContainer" containerID="1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.638587 4817 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11"} err="failed to get container status \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": rpc error: code = NotFound desc = could not find container \"1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11\": container with ID starting with 1ae50616ebb36c87520298c00803cc5550b00bf39e5e8559acf38698039d9b11 not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.638738 4817 scope.go:117] "RemoveContainer" containerID="f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.638962 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1"} err="failed to get container status \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": rpc error: code = NotFound desc = could not find container \"f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1\": container with ID starting with f871f55045e58772e8e55e689974b7145a9bf1eb75351951e0a98d5706750ec1 not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.639000 4817 scope.go:117] "RemoveContainer" containerID="0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.639668 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1"} err="failed to get container status \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": rpc error: code = NotFound desc = could not find container \"0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1\": container with ID starting with 0229124cadbf4c9cd008e5f2548803714d73841ef59d6935ca66428e213dc7a1 not found: ID does not 
exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.639687 4817 scope.go:117] "RemoveContainer" containerID="006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.639936 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63"} err="failed to get container status \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": rpc error: code = NotFound desc = could not find container \"006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63\": container with ID starting with 006997c7298e646b4c4af9e484a730210e0454cb8e4840122dc8ae4786cceb63 not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.640057 4817 scope.go:117] "RemoveContainer" containerID="e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.640393 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80"} err="failed to get container status \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": rpc error: code = NotFound desc = could not find container \"e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80\": container with ID starting with e6c9321f862dbcb7d2372148cffc08c3bb5644753ef212c4e7843d0f88895b80 not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.640468 4817 scope.go:117] "RemoveContainer" containerID="74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.640738 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed"} err="failed to get container status 
\"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": rpc error: code = NotFound desc = could not find container \"74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed\": container with ID starting with 74fd740f996fbca4cc4b0ceef601777cc90a0c05812ddaba08afac4d609848ed not found: ID does not exist" Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.863099 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:08:42 crc kubenswrapper[4817]: I0218 14:08:42.863167 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:08:43 crc kubenswrapper[4817]: I0218 14:08:43.330971 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xkbz6_04978aec-7cd4-435f-a1d9-d3e0223c0e75/kube-multus/0.log" Feb 18 14:08:43 crc kubenswrapper[4817]: I0218 14:08:43.331340 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xkbz6" event={"ID":"04978aec-7cd4-435f-a1d9-d3e0223c0e75","Type":"ContainerStarted","Data":"0184267e3ce6b18e150c60c50ed30d3fc7e345b5866b635e02fc95c9e47969cc"} Feb 18 14:08:43 crc kubenswrapper[4817]: I0218 14:08:43.337646 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" event={"ID":"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5","Type":"ContainerStarted","Data":"4e10b7b99f0bcfcd7c027c4eda041f038409fa5ffe7c789de26eba45312cabb2"} Feb 18 14:08:43 crc kubenswrapper[4817]: I0218 14:08:43.337683 4817 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" event={"ID":"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5","Type":"ContainerStarted","Data":"d614418fe77df338fb2de10bce85da9d36bcea46beaa9eba18d9838e6e5dcf59"} Feb 18 14:08:43 crc kubenswrapper[4817]: I0218 14:08:43.337693 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" event={"ID":"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5","Type":"ContainerStarted","Data":"c2250951b63cc08ba0e7639e384dd4c4694f3d756b3970b71c0253cb6a82511c"} Feb 18 14:08:43 crc kubenswrapper[4817]: I0218 14:08:43.337700 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" event={"ID":"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5","Type":"ContainerStarted","Data":"c10aa333d80e7fc85e0c44f053b937d21cbf1895a7ad68f40795530a60cc1dc8"} Feb 18 14:08:43 crc kubenswrapper[4817]: I0218 14:08:43.337708 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" event={"ID":"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5","Type":"ContainerStarted","Data":"e6564c2336cc76c9b441ea207e9fbd64dcc8fcfa69cd19969072d2536d4cf24b"} Feb 18 14:08:43 crc kubenswrapper[4817]: I0218 14:08:43.337716 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" event={"ID":"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5","Type":"ContainerStarted","Data":"4daf2324a0ddd7b90f10fa99705e9119f98d3c6da1186b25095e2e2a57bb4784"} Feb 18 14:08:44 crc kubenswrapper[4817]: I0218 14:08:44.180227 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b" path="/var/lib/kubelet/pods/1d3841ed-d9b2-4e7a-9eb3-6650fa0be74b/volumes" Feb 18 14:08:46 crc kubenswrapper[4817]: I0218 14:08:46.354235 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" 
event={"ID":"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5","Type":"ContainerStarted","Data":"468cacfb91ffd0b91928c881534253b211f4b77d659908fb8d278a4b34cd1ab6"} Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.847696 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw"] Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.850496 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.856013 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.856489 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.856449 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-4rw6n" Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.982064 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw"] Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.982885 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.983312 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg7vb\" (UniqueName: \"kubernetes.io/projected/bc446e23-6b46-40cc-b058-5f8d491d8310-kube-api-access-qg7vb\") pod \"obo-prometheus-operator-68bc856cb9-2p6zw\" (UID: \"bc446e23-6b46-40cc-b058-5f8d491d8310\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.984762 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-hzpfh" Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.984941 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.999257 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh"] Feb 18 14:08:47 crc kubenswrapper[4817]: I0218 14:08:47.999862 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.070804 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-xcdgl"] Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.071511 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.073594 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.073807 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-4t5n8" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.084412 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa24c32b-4905-4756-a765-195d6b0b6c1a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw\" (UID: \"fa24c32b-4905-4756-a765-195d6b0b6c1a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.084486 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa24c32b-4905-4756-a765-195d6b0b6c1a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw\" (UID: \"fa24c32b-4905-4756-a765-195d6b0b6c1a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.084542 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fad4abaa-bb3e-4fa2-9478-37e792ead430-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh\" (UID: \"fad4abaa-bb3e-4fa2-9478-37e792ead430\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.084582 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qg7vb\" (UniqueName: \"kubernetes.io/projected/bc446e23-6b46-40cc-b058-5f8d491d8310-kube-api-access-qg7vb\") pod \"obo-prometheus-operator-68bc856cb9-2p6zw\" (UID: \"bc446e23-6b46-40cc-b058-5f8d491d8310\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.084622 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fad4abaa-bb3e-4fa2-9478-37e792ead430-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh\" (UID: \"fad4abaa-bb3e-4fa2-9478-37e792ead430\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.102190 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg7vb\" (UniqueName: \"kubernetes.io/projected/bc446e23-6b46-40cc-b058-5f8d491d8310-kube-api-access-qg7vb\") pod \"obo-prometheus-operator-68bc856cb9-2p6zw\" (UID: \"bc446e23-6b46-40cc-b058-5f8d491d8310\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.169403 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.181863 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-qsnlp"] Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.182758 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.185943 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa24c32b-4905-4756-a765-195d6b0b6c1a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw\" (UID: \"fa24c32b-4905-4756-a765-195d6b0b6c1a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.190209 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2hgm\" (UniqueName: \"kubernetes.io/projected/02b7c5c2-ac49-498f-9c4c-c64cf484d131-kube-api-access-x2hgm\") pod \"observability-operator-59bdc8b94-xcdgl\" (UID: \"02b7c5c2-ac49-498f-9c4c-c64cf484d131\") " pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.190279 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa24c32b-4905-4756-a765-195d6b0b6c1a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw\" (UID: \"fa24c32b-4905-4756-a765-195d6b0b6c1a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.190509 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fad4abaa-bb3e-4fa2-9478-37e792ead430-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh\" (UID: \"fad4abaa-bb3e-4fa2-9478-37e792ead430\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.190547 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/02b7c5c2-ac49-498f-9c4c-c64cf484d131-observability-operator-tls\") pod \"observability-operator-59bdc8b94-xcdgl\" (UID: \"02b7c5c2-ac49-498f-9c4c-c64cf484d131\") " pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.190579 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fad4abaa-bb3e-4fa2-9478-37e792ead430-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh\" (UID: \"fad4abaa-bb3e-4fa2-9478-37e792ead430\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.189779 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fa24c32b-4905-4756-a765-195d6b0b6c1a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw\" (UID: \"fa24c32b-4905-4756-a765-195d6b0b6c1a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.188358 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-s4jpt" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.210029 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fad4abaa-bb3e-4fa2-9478-37e792ead430-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh\" (UID: \"fad4abaa-bb3e-4fa2-9478-37e792ead430\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.211557 4817 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fad4abaa-bb3e-4fa2-9478-37e792ead430-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh\" (UID: \"fad4abaa-bb3e-4fa2-9478-37e792ead430\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.211607 4817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators_bc446e23-6b46-40cc-b058-5f8d491d8310_0(5d3b6ebe9601c743e3ce117eb08a633a0e6ada061ad79a422139b2d1ae252230): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.211842 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators_bc446e23-6b46-40cc-b058-5f8d491d8310_0(5d3b6ebe9601c743e3ce117eb08a633a0e6ada061ad79a422139b2d1ae252230): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.211940 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators_bc446e23-6b46-40cc-b058-5f8d491d8310_0(5d3b6ebe9601c743e3ce117eb08a633a0e6ada061ad79a422139b2d1ae252230): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.212090 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators(bc446e23-6b46-40cc-b058-5f8d491d8310)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators(bc446e23-6b46-40cc-b058-5f8d491d8310)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators_bc446e23-6b46-40cc-b058-5f8d491d8310_0(5d3b6ebe9601c743e3ce117eb08a633a0e6ada061ad79a422139b2d1ae252230): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" podUID="bc446e23-6b46-40cc-b058-5f8d491d8310" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.212453 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fa24c32b-4905-4756-a765-195d6b0b6c1a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw\" (UID: \"fa24c32b-4905-4756-a765-195d6b0b6c1a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.292135 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/02b7c5c2-ac49-498f-9c4c-c64cf484d131-observability-operator-tls\") pod \"observability-operator-59bdc8b94-xcdgl\" (UID: \"02b7c5c2-ac49-498f-9c4c-c64cf484d131\") " pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.292207 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f5816544-7d2c-4bf3-aeab-98f546573810-openshift-service-ca\") pod \"perses-operator-5bf474d74f-qsnlp\" (UID: \"f5816544-7d2c-4bf3-aeab-98f546573810\") " pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.292231 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvzcm\" (UniqueName: \"kubernetes.io/projected/f5816544-7d2c-4bf3-aeab-98f546573810-kube-api-access-jvzcm\") pod \"perses-operator-5bf474d74f-qsnlp\" (UID: \"f5816544-7d2c-4bf3-aeab-98f546573810\") " pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.292299 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2hgm\" (UniqueName: \"kubernetes.io/projected/02b7c5c2-ac49-498f-9c4c-c64cf484d131-kube-api-access-x2hgm\") pod \"observability-operator-59bdc8b94-xcdgl\" (UID: \"02b7c5c2-ac49-498f-9c4c-c64cf484d131\") " pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.295279 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.297837 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/02b7c5c2-ac49-498f-9c4c-c64cf484d131-observability-operator-tls\") pod \"observability-operator-59bdc8b94-xcdgl\" (UID: \"02b7c5c2-ac49-498f-9c4c-c64cf484d131\") " pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.311929 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2hgm\" (UniqueName: \"kubernetes.io/projected/02b7c5c2-ac49-498f-9c4c-c64cf484d131-kube-api-access-x2hgm\") pod \"observability-operator-59bdc8b94-xcdgl\" (UID: \"02b7c5c2-ac49-498f-9c4c-c64cf484d131\") " pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.312602 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.329431 4817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators_fa24c32b-4905-4756-a765-195d6b0b6c1a_0(129e0de421b9bc841dd468a8ff86b85b79b91d346260e18d1aed07179f8e9335): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.329573 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators_fa24c32b-4905-4756-a765-195d6b0b6c1a_0(129e0de421b9bc841dd468a8ff86b85b79b91d346260e18d1aed07179f8e9335): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.329644 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators_fa24c32b-4905-4756-a765-195d6b0b6c1a_0(129e0de421b9bc841dd468a8ff86b85b79b91d346260e18d1aed07179f8e9335): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.329739 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators(fa24c32b-4905-4756-a765-195d6b0b6c1a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators(fa24c32b-4905-4756-a765-195d6b0b6c1a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators_fa24c32b-4905-4756-a765-195d6b0b6c1a_0(129e0de421b9bc841dd468a8ff86b85b79b91d346260e18d1aed07179f8e9335): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" podUID="fa24c32b-4905-4756-a765-195d6b0b6c1a" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.339816 4817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators_fad4abaa-bb3e-4fa2-9478-37e792ead430_0(b5805d3e32ad21f5dacc9b52ea2bd0e3de241f576fa24ecda7c8ddc36ac94819): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.339938 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators_fad4abaa-bb3e-4fa2-9478-37e792ead430_0(b5805d3e32ad21f5dacc9b52ea2bd0e3de241f576fa24ecda7c8ddc36ac94819): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.340085 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators_fad4abaa-bb3e-4fa2-9478-37e792ead430_0(b5805d3e32ad21f5dacc9b52ea2bd0e3de241f576fa24ecda7c8ddc36ac94819): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.340189 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators(fad4abaa-bb3e-4fa2-9478-37e792ead430)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators(fad4abaa-bb3e-4fa2-9478-37e792ead430)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators_fad4abaa-bb3e-4fa2-9478-37e792ead430_0(b5805d3e32ad21f5dacc9b52ea2bd0e3de241f576fa24ecda7c8ddc36ac94819): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" podUID="fad4abaa-bb3e-4fa2-9478-37e792ead430" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.368497 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" event={"ID":"ca32d0f6-1d85-40cb-b6d4-12aaf8d345a5","Type":"ContainerStarted","Data":"faac87606bb33cb9e45c6b8845b1d772e81c3247c4ed02cdb28341e0401862de"} Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.368851 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.368955 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.386700 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.393196 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f5816544-7d2c-4bf3-aeab-98f546573810-openshift-service-ca\") pod \"perses-operator-5bf474d74f-qsnlp\" (UID: \"f5816544-7d2c-4bf3-aeab-98f546573810\") " pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.393369 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvzcm\" (UniqueName: \"kubernetes.io/projected/f5816544-7d2c-4bf3-aeab-98f546573810-kube-api-access-jvzcm\") pod \"perses-operator-5bf474d74f-qsnlp\" (UID: \"f5816544-7d2c-4bf3-aeab-98f546573810\") " pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.394134 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f5816544-7d2c-4bf3-aeab-98f546573810-openshift-service-ca\") pod \"perses-operator-5bf474d74f-qsnlp\" (UID: \"f5816544-7d2c-4bf3-aeab-98f546573810\") " pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.395264 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.401621 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" podStartSLOduration=7.401600342 podStartE2EDuration="7.401600342s" podCreationTimestamp="2026-02-18 14:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:08:48.400552006 +0000 UTC 
m=+590.976087999" watchObservedRunningTime="2026-02-18 14:08:48.401600342 +0000 UTC m=+590.977136335" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.430852 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvzcm\" (UniqueName: \"kubernetes.io/projected/f5816544-7d2c-4bf3-aeab-98f546573810-kube-api-access-jvzcm\") pod \"perses-operator-5bf474d74f-qsnlp\" (UID: \"f5816544-7d2c-4bf3-aeab-98f546573810\") " pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.433913 4817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-xcdgl_openshift-operators_02b7c5c2-ac49-498f-9c4c-c64cf484d131_0(1d6de517f1b7130209b514f8e74a7bfc0730b4cc39f972f976dec6e7b64bc6d8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.434011 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-xcdgl_openshift-operators_02b7c5c2-ac49-498f-9c4c-c64cf484d131_0(1d6de517f1b7130209b514f8e74a7bfc0730b4cc39f972f976dec6e7b64bc6d8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.434041 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-xcdgl_openshift-operators_02b7c5c2-ac49-498f-9c4c-c64cf484d131_0(1d6de517f1b7130209b514f8e74a7bfc0730b4cc39f972f976dec6e7b64bc6d8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.434090 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-xcdgl_openshift-operators(02b7c5c2-ac49-498f-9c4c-c64cf484d131)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-xcdgl_openshift-operators(02b7c5c2-ac49-498f-9c4c-c64cf484d131)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-xcdgl_openshift-operators_02b7c5c2-ac49-498f-9c4c-c64cf484d131_0(1d6de517f1b7130209b514f8e74a7bfc0730b4cc39f972f976dec6e7b64bc6d8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" podUID="02b7c5c2-ac49-498f-9c4c-c64cf484d131" Feb 18 14:08:48 crc kubenswrapper[4817]: I0218 14:08:48.531455 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.558370 4817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-qsnlp_openshift-operators_f5816544-7d2c-4bf3-aeab-98f546573810_0(14de12a902c54b4065439457fb6e567332f2933c30f6824ba899f3b252567fcb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.558445 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-qsnlp_openshift-operators_f5816544-7d2c-4bf3-aeab-98f546573810_0(14de12a902c54b4065439457fb6e567332f2933c30f6824ba899f3b252567fcb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.558469 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-qsnlp_openshift-operators_f5816544-7d2c-4bf3-aeab-98f546573810_0(14de12a902c54b4065439457fb6e567332f2933c30f6824ba899f3b252567fcb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:48 crc kubenswrapper[4817]: E0218 14:08:48.558517 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-qsnlp_openshift-operators(f5816544-7d2c-4bf3-aeab-98f546573810)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-qsnlp_openshift-operators(f5816544-7d2c-4bf3-aeab-98f546573810)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-qsnlp_openshift-operators_f5816544-7d2c-4bf3-aeab-98f546573810_0(14de12a902c54b4065439457fb6e567332f2933c30f6824ba899f3b252567fcb): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" podUID="f5816544-7d2c-4bf3-aeab-98f546573810" Feb 18 14:08:49 crc kubenswrapper[4817]: I0218 14:08:49.373171 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:49 crc kubenswrapper[4817]: I0218 14:08:49.438186 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs" Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.211562 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-qsnlp"] Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.212229 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.212769 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.218876 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw"] Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.219007 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.219565 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.255532 4817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-qsnlp_openshift-operators_f5816544-7d2c-4bf3-aeab-98f546573810_0(b63b19a2ff823e3a4a78b638104af54a3a6c5c0bdb4ff7c156eb3f9362f9433d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.255649 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-qsnlp_openshift-operators_f5816544-7d2c-4bf3-aeab-98f546573810_0(b63b19a2ff823e3a4a78b638104af54a3a6c5c0bdb4ff7c156eb3f9362f9433d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.255685 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-qsnlp_openshift-operators_f5816544-7d2c-4bf3-aeab-98f546573810_0(b63b19a2ff823e3a4a78b638104af54a3a6c5c0bdb4ff7c156eb3f9362f9433d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.255741 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-qsnlp_openshift-operators(f5816544-7d2c-4bf3-aeab-98f546573810)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-qsnlp_openshift-operators(f5816544-7d2c-4bf3-aeab-98f546573810)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-qsnlp_openshift-operators_f5816544-7d2c-4bf3-aeab-98f546573810_0(b63b19a2ff823e3a4a78b638104af54a3a6c5c0bdb4ff7c156eb3f9362f9433d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" podUID="f5816544-7d2c-4bf3-aeab-98f546573810" Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.268360 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh"] Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.268479 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.268953 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.270644 4817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators_fa24c32b-4905-4756-a765-195d6b0b6c1a_0(1ee63cf84d2ce78fec430fdca6dd70af26110ce7409f2d230a786a4f2ce99c61): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.270692 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators_fa24c32b-4905-4756-a765-195d6b0b6c1a_0(1ee63cf84d2ce78fec430fdca6dd70af26110ce7409f2d230a786a4f2ce99c61): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.270725 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators_fa24c32b-4905-4756-a765-195d6b0b6c1a_0(1ee63cf84d2ce78fec430fdca6dd70af26110ce7409f2d230a786a4f2ce99c61): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.270782 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators(fa24c32b-4905-4756-a765-195d6b0b6c1a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators(fa24c32b-4905-4756-a765-195d6b0b6c1a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_openshift-operators_fa24c32b-4905-4756-a765-195d6b0b6c1a_0(1ee63cf84d2ce78fec430fdca6dd70af26110ce7409f2d230a786a4f2ce99c61): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" podUID="fa24c32b-4905-4756-a765-195d6b0b6c1a" Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.271432 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw"] Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.271582 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.272219 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.281825 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-xcdgl"] Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.281937 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:50 crc kubenswrapper[4817]: I0218 14:08:50.282495 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.305147 4817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators_fad4abaa-bb3e-4fa2-9478-37e792ead430_0(ddd2be83fea452708eb5f37442421da75afd7ccb1ba26cfc62ee9564a21934dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.305203 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators_fad4abaa-bb3e-4fa2-9478-37e792ead430_0(ddd2be83fea452708eb5f37442421da75afd7ccb1ba26cfc62ee9564a21934dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.305226 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators_fad4abaa-bb3e-4fa2-9478-37e792ead430_0(ddd2be83fea452708eb5f37442421da75afd7ccb1ba26cfc62ee9564a21934dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.305267 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators(fad4abaa-bb3e-4fa2-9478-37e792ead430)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators(fad4abaa-bb3e-4fa2-9478-37e792ead430)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_openshift-operators_fad4abaa-bb3e-4fa2-9478-37e792ead430_0(ddd2be83fea452708eb5f37442421da75afd7ccb1ba26cfc62ee9564a21934dd): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" podUID="fad4abaa-bb3e-4fa2-9478-37e792ead430" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.330918 4817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators_bc446e23-6b46-40cc-b058-5f8d491d8310_0(100bd22730871d0165ddec3479045941dae7943d79a671e66318580eeb1538e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.330996 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators_bc446e23-6b46-40cc-b058-5f8d491d8310_0(100bd22730871d0165ddec3479045941dae7943d79a671e66318580eeb1538e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.331058 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators_bc446e23-6b46-40cc-b058-5f8d491d8310_0(100bd22730871d0165ddec3479045941dae7943d79a671e66318580eeb1538e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw"
Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.331127 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators(bc446e23-6b46-40cc-b058-5f8d491d8310)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators(bc446e23-6b46-40cc-b058-5f8d491d8310)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2p6zw_openshift-operators_bc446e23-6b46-40cc-b058-5f8d491d8310_0(100bd22730871d0165ddec3479045941dae7943d79a671e66318580eeb1538e0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" podUID="bc446e23-6b46-40cc-b058-5f8d491d8310"
Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.337327 4817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-xcdgl_openshift-operators_02b7c5c2-ac49-498f-9c4c-c64cf484d131_0(e59ec0a048bfcc1ac893a9ffea56c2eed26478821ae6296ddf22cdf83bd819ba): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.337395 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-xcdgl_openshift-operators_02b7c5c2-ac49-498f-9c4c-c64cf484d131_0(e59ec0a048bfcc1ac893a9ffea56c2eed26478821ae6296ddf22cdf83bd819ba): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl"
Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.337419 4817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-xcdgl_openshift-operators_02b7c5c2-ac49-498f-9c4c-c64cf484d131_0(e59ec0a048bfcc1ac893a9ffea56c2eed26478821ae6296ddf22cdf83bd819ba): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl"
Feb 18 14:08:50 crc kubenswrapper[4817]: E0218 14:08:50.337461 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-xcdgl_openshift-operators(02b7c5c2-ac49-498f-9c4c-c64cf484d131)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-xcdgl_openshift-operators(02b7c5c2-ac49-498f-9c4c-c64cf484d131)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-xcdgl_openshift-operators_02b7c5c2-ac49-498f-9c4c-c64cf484d131_0(e59ec0a048bfcc1ac893a9ffea56c2eed26478821ae6296ddf22cdf83bd819ba): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" podUID="02b7c5c2-ac49-498f-9c4c-c64cf484d131"
Feb 18 14:09:01 crc kubenswrapper[4817]: I0218 14:09:01.171276 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl"
Feb 18 14:09:01 crc kubenswrapper[4817]: I0218 14:09:01.171276 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp"
Feb 18 14:09:01 crc kubenswrapper[4817]: I0218 14:09:01.173479 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl"
Feb 18 14:09:01 crc kubenswrapper[4817]: I0218 14:09:01.173505 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp"
Feb 18 14:09:01 crc kubenswrapper[4817]: I0218 14:09:01.387241 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-qsnlp"]
Feb 18 14:09:01 crc kubenswrapper[4817]: W0218 14:09:01.391988 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5816544_7d2c_4bf3_aeab_98f546573810.slice/crio-6bfb2d61590241c8acfe43233466e779aa50550f02709cda1d8fbf717e35df02 WatchSource:0}: Error finding container 6bfb2d61590241c8acfe43233466e779aa50550f02709cda1d8fbf717e35df02: Status 404 returned error can't find the container with id 6bfb2d61590241c8acfe43233466e779aa50550f02709cda1d8fbf717e35df02
Feb 18 14:09:01 crc kubenswrapper[4817]: I0218 14:09:01.429589 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" event={"ID":"f5816544-7d2c-4bf3-aeab-98f546573810","Type":"ContainerStarted","Data":"6bfb2d61590241c8acfe43233466e779aa50550f02709cda1d8fbf717e35df02"}
Feb 18 14:09:01 crc kubenswrapper[4817]: I0218 14:09:01.651521 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-xcdgl"]
Feb 18 14:09:01 crc kubenswrapper[4817]: W0218 14:09:01.657432 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02b7c5c2_ac49_498f_9c4c_c64cf484d131.slice/crio-b9b7927c18eb7fc6633c2deb74fdc222499e9e970c34de898ce41b82afcb51b3 WatchSource:0}: Error finding container b9b7927c18eb7fc6633c2deb74fdc222499e9e970c34de898ce41b82afcb51b3: Status 404 returned error can't find the container with id b9b7927c18eb7fc6633c2deb74fdc222499e9e970c34de898ce41b82afcb51b3
Feb 18 14:09:02 crc kubenswrapper[4817]: I0218 14:09:02.172214 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw"
Feb 18 14:09:02 crc kubenswrapper[4817]: I0218 14:09:02.172426 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw"
Feb 18 14:09:02 crc kubenswrapper[4817]: I0218 14:09:02.438331 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" event={"ID":"02b7c5c2-ac49-498f-9c4c-c64cf484d131","Type":"ContainerStarted","Data":"b9b7927c18eb7fc6633c2deb74fdc222499e9e970c34de898ce41b82afcb51b3"}
Feb 18 14:09:02 crc kubenswrapper[4817]: I0218 14:09:02.601628 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw"]
Feb 18 14:09:03 crc kubenswrapper[4817]: I0218 14:09:03.170560 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw"
Feb 18 14:09:03 crc kubenswrapper[4817]: I0218 14:09:03.170947 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw"
Feb 18 14:09:03 crc kubenswrapper[4817]: I0218 14:09:03.447185 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" event={"ID":"fa24c32b-4905-4756-a765-195d6b0b6c1a","Type":"ContainerStarted","Data":"45e4d5febb6e32f7eb466399c86c80baf2a5d2f8e5cc41ea9ed08d2315ef1530"}
Feb 18 14:09:03 crc kubenswrapper[4817]: I0218 14:09:03.593250 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw"]
Feb 18 14:09:03 crc kubenswrapper[4817]: W0218 14:09:03.600671 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc446e23_6b46_40cc_b058_5f8d491d8310.slice/crio-a33c4c1a07ba1a5aa655416852b03a0d149d79a854529e5dec65b25040fb45b7 WatchSource:0}: Error finding container a33c4c1a07ba1a5aa655416852b03a0d149d79a854529e5dec65b25040fb45b7: Status 404 returned error can't find the container with id a33c4c1a07ba1a5aa655416852b03a0d149d79a854529e5dec65b25040fb45b7
Feb 18 14:09:04 crc kubenswrapper[4817]: I0218 14:09:04.171257 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh"
Feb 18 14:09:04 crc kubenswrapper[4817]: I0218 14:09:04.171766 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh"
Feb 18 14:09:04 crc kubenswrapper[4817]: I0218 14:09:04.455676 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" event={"ID":"bc446e23-6b46-40cc-b058-5f8d491d8310","Type":"ContainerStarted","Data":"a33c4c1a07ba1a5aa655416852b03a0d149d79a854529e5dec65b25040fb45b7"}
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.082591 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh"]
Feb 18 14:09:08 crc kubenswrapper[4817]: W0218 14:09:08.091303 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad4abaa_bb3e_4fa2_9478_37e792ead430.slice/crio-461396a01b3cc632b2f262e57a717c358b427c4cd180d029a4cc11a236f1175f WatchSource:0}: Error finding container 461396a01b3cc632b2f262e57a717c358b427c4cd180d029a4cc11a236f1175f: Status 404 returned error can't find the container with id 461396a01b3cc632b2f262e57a717c358b427c4cd180d029a4cc11a236f1175f
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.485318 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" event={"ID":"02b7c5c2-ac49-498f-9c4c-c64cf484d131","Type":"ContainerStarted","Data":"c623a14249a2ebda6e041dc53b69311775dd33a4674d6ad37ddeae4bd662b0ca"}
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.486436 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl"
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.487282 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" event={"ID":"fa24c32b-4905-4756-a765-195d6b0b6c1a","Type":"ContainerStarted","Data":"5babde1a30429cd09c0c9edc81f6bd2b9ccf97fa51744c63d7b281b407fd7746"}
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.488089 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl"
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.502183 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" event={"ID":"f5816544-7d2c-4bf3-aeab-98f546573810","Type":"ContainerStarted","Data":"f31c6f890786ac9e3a367da714deddf9f447060cd6fb182671183f0366dc18dc"}
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.503288 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp"
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.512185 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" event={"ID":"bc446e23-6b46-40cc-b058-5f8d491d8310","Type":"ContainerStarted","Data":"8cce60e8d03807e0fc3824877a181806a082756abdc85fdef7a692712581ad5c"}
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.513809 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" event={"ID":"fad4abaa-bb3e-4fa2-9478-37e792ead430","Type":"ContainerStarted","Data":"adcd209428bcf2c67347b6c7e557bd699b8e77480c7e3f8dfa93a945709e522f"}
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.513845 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" event={"ID":"fad4abaa-bb3e-4fa2-9478-37e792ead430","Type":"ContainerStarted","Data":"461396a01b3cc632b2f262e57a717c358b427c4cd180d029a4cc11a236f1175f"}
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.518549 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-xcdgl" podStartSLOduration=14.279749084 podStartE2EDuration="20.518530566s" podCreationTimestamp="2026-02-18 14:08:48 +0000 UTC" firstStartedPulling="2026-02-18 14:09:01.660144056 +0000 UTC m=+604.235680039" lastFinishedPulling="2026-02-18 14:09:07.898925538 +0000 UTC m=+610.474461521" observedRunningTime="2026-02-18 14:09:08.514092226 +0000 UTC m=+611.089628209" watchObservedRunningTime="2026-02-18 14:09:08.518530566 +0000 UTC m=+611.094066549"
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.562189 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh" podStartSLOduration=21.562171976 podStartE2EDuration="21.562171976s" podCreationTimestamp="2026-02-18 14:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:09:08.529852886 +0000 UTC m=+611.105388879" watchObservedRunningTime="2026-02-18 14:09:08.562171976 +0000 UTC m=+611.137707959"
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.563228 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2p6zw" podStartSLOduration=17.20694181 podStartE2EDuration="21.563220872s" podCreationTimestamp="2026-02-18 14:08:47 +0000 UTC" firstStartedPulling="2026-02-18 14:09:03.602920717 +0000 UTC m=+606.178456700" lastFinishedPulling="2026-02-18 14:09:07.959199769 +0000 UTC m=+610.534735762" observedRunningTime="2026-02-18 14:09:08.56033816 +0000 UTC m=+611.135874153" watchObservedRunningTime="2026-02-18 14:09:08.563220872 +0000 UTC m=+611.138756855"
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.591284 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw" podStartSLOduration=16.259309706 podStartE2EDuration="21.591269695s" podCreationTimestamp="2026-02-18 14:08:47 +0000 UTC" firstStartedPulling="2026-02-18 14:09:02.615351496 +0000 UTC m=+605.190887489" lastFinishedPulling="2026-02-18 14:09:07.947311475 +0000 UTC m=+610.522847478" observedRunningTime="2026-02-18 14:09:08.589085471 +0000 UTC m=+611.164621454" watchObservedRunningTime="2026-02-18 14:09:08.591269695 +0000 UTC m=+611.166805678"
Feb 18 14:09:08 crc kubenswrapper[4817]: I0218 14:09:08.614999 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp" podStartSLOduration=14.062078179 podStartE2EDuration="20.614967472s" podCreationTimestamp="2026-02-18 14:08:48 +0000 UTC" firstStartedPulling="2026-02-18 14:09:01.395008957 +0000 UTC m=+603.970544940" lastFinishedPulling="2026-02-18 14:09:07.94789825 +0000 UTC m=+610.523434233" observedRunningTime="2026-02-18 14:09:08.61206619 +0000 UTC m=+611.187602173" watchObservedRunningTime="2026-02-18 14:09:08.614967472 +0000 UTC m=+611.190503455"
Feb 18 14:09:12 crc kubenswrapper[4817]: I0218 14:09:12.108833 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c2xjs"
Feb 18 14:09:12 crc kubenswrapper[4817]: I0218 14:09:12.863090 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:09:12 crc kubenswrapper[4817]: I0218 14:09:12.863458 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.503587 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-52jl8"]
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.504867 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-52jl8"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.506886 4817 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ptggs"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.510265 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.510677 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.515924 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kgq95"]
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.517080 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kgq95"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.519577 4817 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-t47ft"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.521576 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-52jl8"]
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.536036 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-qsnlp"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.540506 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kgq95"]
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.563764 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-l2hvx"]
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.564476 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-l2hvx"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.571029 4817 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-frk8k"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.579944 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-l2hvx"]
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.685344 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrbs2\" (UniqueName: \"kubernetes.io/projected/00d35822-7854-4547-8c51-7d8f747fcb9c-kube-api-access-vrbs2\") pod \"cert-manager-cainjector-cf98fcc89-kgq95\" (UID: \"00d35822-7854-4547-8c51-7d8f747fcb9c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kgq95"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.685538 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxst9\" (UniqueName: \"kubernetes.io/projected/cff7c17a-00dd-470b-a121-c8e86485d4ac-kube-api-access-bxst9\") pod \"cert-manager-858654f9db-52jl8\" (UID: \"cff7c17a-00dd-470b-a121-c8e86485d4ac\") " pod="cert-manager/cert-manager-858654f9db-52jl8"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.685715 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmwnb\" (UniqueName: \"kubernetes.io/projected/64c3da31-0521-4691-86b4-66f99e11c898-kube-api-access-vmwnb\") pod \"cert-manager-webhook-687f57d79b-l2hvx\" (UID: \"64c3da31-0521-4691-86b4-66f99e11c898\") " pod="cert-manager/cert-manager-webhook-687f57d79b-l2hvx"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.786784 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxst9\" (UniqueName: \"kubernetes.io/projected/cff7c17a-00dd-470b-a121-c8e86485d4ac-kube-api-access-bxst9\") pod \"cert-manager-858654f9db-52jl8\" (UID: \"cff7c17a-00dd-470b-a121-c8e86485d4ac\") " pod="cert-manager/cert-manager-858654f9db-52jl8"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.786887 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmwnb\" (UniqueName: \"kubernetes.io/projected/64c3da31-0521-4691-86b4-66f99e11c898-kube-api-access-vmwnb\") pod \"cert-manager-webhook-687f57d79b-l2hvx\" (UID: \"64c3da31-0521-4691-86b4-66f99e11c898\") " pod="cert-manager/cert-manager-webhook-687f57d79b-l2hvx"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.786938 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrbs2\" (UniqueName: \"kubernetes.io/projected/00d35822-7854-4547-8c51-7d8f747fcb9c-kube-api-access-vrbs2\") pod \"cert-manager-cainjector-cf98fcc89-kgq95\" (UID: \"00d35822-7854-4547-8c51-7d8f747fcb9c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kgq95"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.808197 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxst9\" (UniqueName: \"kubernetes.io/projected/cff7c17a-00dd-470b-a121-c8e86485d4ac-kube-api-access-bxst9\") pod \"cert-manager-858654f9db-52jl8\" (UID: \"cff7c17a-00dd-470b-a121-c8e86485d4ac\") " pod="cert-manager/cert-manager-858654f9db-52jl8"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.809528 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrbs2\" (UniqueName: \"kubernetes.io/projected/00d35822-7854-4547-8c51-7d8f747fcb9c-kube-api-access-vrbs2\") pod \"cert-manager-cainjector-cf98fcc89-kgq95\" (UID: \"00d35822-7854-4547-8c51-7d8f747fcb9c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kgq95"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.813550 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmwnb\" (UniqueName: \"kubernetes.io/projected/64c3da31-0521-4691-86b4-66f99e11c898-kube-api-access-vmwnb\") pod \"cert-manager-webhook-687f57d79b-l2hvx\" (UID: \"64c3da31-0521-4691-86b4-66f99e11c898\") " pod="cert-manager/cert-manager-webhook-687f57d79b-l2hvx"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.829348 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-52jl8"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.840612 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kgq95"
Feb 18 14:09:18 crc kubenswrapper[4817]: I0218 14:09:18.887841 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-l2hvx"
Feb 18 14:09:19 crc kubenswrapper[4817]: I0218 14:09:19.169309 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-l2hvx"]
Feb 18 14:09:19 crc kubenswrapper[4817]: I0218 14:09:19.289661 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-52jl8"]
Feb 18 14:09:19 crc kubenswrapper[4817]: W0218 14:09:19.290429 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcff7c17a_00dd_470b_a121_c8e86485d4ac.slice/crio-4e5246810dfb6775e41cfdf8685e3c4466c293b8d8650f2918990f25fa7fc63f WatchSource:0}: Error finding container 4e5246810dfb6775e41cfdf8685e3c4466c293b8d8650f2918990f25fa7fc63f: Status 404 returned error can't find the container with id 4e5246810dfb6775e41cfdf8685e3c4466c293b8d8650f2918990f25fa7fc63f
Feb 18 14:09:19 crc kubenswrapper[4817]: I0218 14:09:19.364327 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kgq95"]
Feb 18 14:09:19 crc kubenswrapper[4817]: I0218 14:09:19.581220 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-52jl8" event={"ID":"cff7c17a-00dd-470b-a121-c8e86485d4ac","Type":"ContainerStarted","Data":"4e5246810dfb6775e41cfdf8685e3c4466c293b8d8650f2918990f25fa7fc63f"}
Feb 18 14:09:19 crc kubenswrapper[4817]: I0218 14:09:19.582494 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-l2hvx" event={"ID":"64c3da31-0521-4691-86b4-66f99e11c898","Type":"ContainerStarted","Data":"7ea27dc0f93f035cf47a75c419e0f16e7347682cf5b5980494d467fa710eaa8a"}
Feb 18 14:09:19 crc kubenswrapper[4817]: I0218 14:09:19.583568 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kgq95" event={"ID":"00d35822-7854-4547-8c51-7d8f747fcb9c","Type":"ContainerStarted","Data":"6b7b3b08a84f4211d635c4f969941d6404feac39595222cc3ec1c2e1092245bf"}
Feb 18 14:09:27 crc kubenswrapper[4817]: I0218 14:09:27.654738 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kgq95" event={"ID":"00d35822-7854-4547-8c51-7d8f747fcb9c","Type":"ContainerStarted","Data":"56c8bc6ff1e12204bdac5fc463a1ecc3862395a1a3c561ea7d7982cebe88ab50"}
Feb 18 14:09:27 crc kubenswrapper[4817]: I0218 14:09:27.656347 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-52jl8" event={"ID":"cff7c17a-00dd-470b-a121-c8e86485d4ac","Type":"ContainerStarted","Data":"a2eb64330bc1b28c0c55a42082e644f8bd27a4e0654f9db213168fcb9bcbbdc6"}
Feb 18 14:09:27 crc kubenswrapper[4817]: I0218 14:09:27.657794 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-l2hvx" event={"ID":"64c3da31-0521-4691-86b4-66f99e11c898","Type":"ContainerStarted","Data":"22b4d389cf6426218aef9b903202c5ed98dc99e7661d7e19dd980ba2386cac63"}
Feb 18 14:09:27 crc kubenswrapper[4817]: I0218 14:09:27.657962 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-l2hvx"
Feb 18 14:09:27 crc kubenswrapper[4817]: I0218 14:09:27.680369 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kgq95" podStartSLOduration=1.822716395 podStartE2EDuration="9.680352134s" podCreationTimestamp="2026-02-18 14:09:18 +0000 UTC" firstStartedPulling="2026-02-18 14:09:19.370793505 +0000 UTC m=+621.946329488" lastFinishedPulling="2026-02-18 14:09:27.228429204 +0000 UTC m=+629.803965227" observedRunningTime="2026-02-18 14:09:27.675809131 +0000 UTC m=+630.251345124" watchObservedRunningTime="2026-02-18 14:09:27.680352134 +0000 UTC m=+630.255888117"
Feb 18 14:09:27 crc kubenswrapper[4817]: I0218 14:09:27.722462 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-l2hvx" podStartSLOduration=1.673033022 podStartE2EDuration="9.722443315s" podCreationTimestamp="2026-02-18 14:09:18 +0000 UTC" firstStartedPulling="2026-02-18 14:09:19.179058082 +0000 UTC m=+621.754594065" lastFinishedPulling="2026-02-18 14:09:27.228468375 +0000 UTC m=+629.804004358" observedRunningTime="2026-02-18 14:09:27.703096446 +0000 UTC m=+630.278632450" watchObservedRunningTime="2026-02-18 14:09:27.722443315 +0000 UTC m=+630.297979298"
Feb 18 14:09:27 crc kubenswrapper[4817]: I0218 14:09:27.723178 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-52jl8" podStartSLOduration=1.606626859 podStartE2EDuration="9.723172143s" podCreationTimestamp="2026-02-18 14:09:18 +0000 UTC" firstStartedPulling="2026-02-18 14:09:19.293252867 +0000 UTC m=+621.868788860" lastFinishedPulling="2026-02-18 14:09:27.409798161 +0000 UTC m=+629.985334144" observedRunningTime="2026-02-18 14:09:27.720593879 +0000 UTC m=+630.296129882" watchObservedRunningTime="2026-02-18 14:09:27.723172143 +0000 UTC m=+630.298708126"
Feb 18 14:09:33 crc kubenswrapper[4817]: I0218 14:09:33.890633 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-l2hvx"
Feb 18 14:09:42 crc kubenswrapper[4817]: I0218 14:09:42.863372 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:09:42 crc kubenswrapper[4817]: I0218 14:09:42.864013 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:09:42 crc kubenswrapper[4817]: I0218 14:09:42.864068 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb"
Feb 18 14:09:42 crc kubenswrapper[4817]: I0218 14:09:42.864679 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7dfc7dd34d408c82e87d251482328355deda32f5409047841a6c0bd478ccafc4"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 14:09:42 crc kubenswrapper[4817]: I0218 14:09:42.864750 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://7dfc7dd34d408c82e87d251482328355deda32f5409047841a6c0bd478ccafc4" gracePeriod=600
Feb 18 14:09:43 crc kubenswrapper[4817]: I0218 14:09:43.793410 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="7dfc7dd34d408c82e87d251482328355deda32f5409047841a6c0bd478ccafc4" exitCode=0
Feb 18 14:09:43 crc kubenswrapper[4817]: I0218 14:09:43.793487 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"7dfc7dd34d408c82e87d251482328355deda32f5409047841a6c0bd478ccafc4"}
Feb 18 14:09:43 crc kubenswrapper[4817]: I0218 14:09:43.793901 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"45f4df11b9cafd0abed8804744792dcd58abd224e061fc9294ed85d9ec653f5f"}
Feb 18 14:09:43 crc kubenswrapper[4817]: I0218 14:09:43.793942 4817 scope.go:117] "RemoveContainer" containerID="d3e9adde8434a7716ab4563cbde77006c2cd5de9992720aea0fc7ac8f5c1757e"
Feb 18 14:09:47 crc kubenswrapper[4817]: I0218 14:09:47.365493 4817 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.233306 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vc4kx"]
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.234822 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.245841 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vc4kx"]
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.279874 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-catalog-content\") pod \"certified-operators-vc4kx\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.279928 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-utilities\") pod \"certified-operators-vc4kx\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.280077 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd9nn\" (UniqueName: \"kubernetes.io/projected/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-kube-api-access-gd9nn\") pod \"certified-operators-vc4kx\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.380958 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9nn\" (UniqueName: \"kubernetes.io/projected/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-kube-api-access-gd9nn\") pod \"certified-operators-vc4kx\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.381040 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-catalog-content\") pod \"certified-operators-vc4kx\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.381079 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-utilities\") pod \"certified-operators-vc4kx\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.381608 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-utilities\") pod \"certified-operators-vc4kx\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.381668 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-catalog-content\") pod \"certified-operators-vc4kx\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.399746 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd9nn\" (UniqueName: \"kubernetes.io/projected/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-kube-api-access-gd9nn\") pod \"certified-operators-vc4kx\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:49 crc kubenswrapper[4817]: I0218 14:09:49.606627 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:50 crc kubenswrapper[4817]: I0218 14:09:50.046087 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vc4kx"]
Feb 18 14:09:50 crc kubenswrapper[4817]: I0218 14:09:50.841274 4817 generic.go:334] "Generic (PLEG): container finished" podID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerID="3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6" exitCode=0
Feb 18 14:09:50 crc kubenswrapper[4817]: I0218 14:09:50.841325 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc4kx" event={"ID":"201c4dfa-e452-4f0f-ab64-b2104e0b1c12","Type":"ContainerDied","Data":"3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6"}
Feb 18 14:09:50 crc kubenswrapper[4817]: I0218 14:09:50.841360 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc4kx" event={"ID":"201c4dfa-e452-4f0f-ab64-b2104e0b1c12","Type":"ContainerStarted","Data":"33ea8b7c82319dbaa0c3cecc05cbf2d0bf3c5ccc3c1157fc971589abf8ca8dfe"}
Feb 18 14:09:52 crc kubenswrapper[4817]: I0218 14:09:52.853232 4817 generic.go:334] "Generic (PLEG): container finished" podID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerID="7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040" exitCode=0
Feb 18 14:09:52 crc kubenswrapper[4817]: I0218 14:09:52.853385 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc4kx" event={"ID":"201c4dfa-e452-4f0f-ab64-b2104e0b1c12","Type":"ContainerDied","Data":"7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040"}
Feb 18 14:09:53 crc kubenswrapper[4817]: I0218 14:09:53.874486 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc4kx" event={"ID":"201c4dfa-e452-4f0f-ab64-b2104e0b1c12","Type":"ContainerStarted","Data":"e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3"}
Feb 18 14:09:53 crc kubenswrapper[4817]: I0218 14:09:53.897763 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vc4kx" podStartSLOduration=2.132430932 podStartE2EDuration="4.897741566s" podCreationTimestamp="2026-02-18 14:09:49 +0000 UTC" firstStartedPulling="2026-02-18 14:09:50.843588836 +0000 UTC m=+653.419124819" lastFinishedPulling="2026-02-18 14:09:53.60889947 +0000 UTC m=+656.184435453" observedRunningTime="2026-02-18 14:09:53.897615773 +0000 UTC m=+656.473151756" watchObservedRunningTime="2026-02-18 14:09:53.897741566 +0000 UTC m=+656.473277569"
Feb 18 14:09:59 crc kubenswrapper[4817]: I0218 14:09:59.607319 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:59 crc kubenswrapper[4817]: I0218 14:09:59.608173 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:59 crc kubenswrapper[4817]: I0218 14:09:59.654238 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:09:59 crc kubenswrapper[4817]: I0218 14:09:59.964894 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vc4kx"
Feb 18 14:10:00 crc kubenswrapper[4817]: I0218 14:10:00.010148 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vc4kx"]
Feb 18 14:10:01 crc kubenswrapper[4817]: I0218 14:10:01.929375 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vc4kx" podUID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerName="registry-server"
containerID="cri-o://e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3" gracePeriod=2 Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.139201 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp"] Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.140478 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.142396 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.142864 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.142906 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.142934 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g4bp\" (UniqueName: \"kubernetes.io/projected/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-kube-api-access-2g4bp\") pod 
\"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.154765 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp"] Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.243597 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g4bp\" (UniqueName: \"kubernetes.io/projected/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-kube-api-access-2g4bp\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.244525 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.244585 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.245357 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.246049 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.264620 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vc4kx" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.268276 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g4bp\" (UniqueName: \"kubernetes.io/projected/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-kube-api-access-2g4bp\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.345752 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-utilities\") pod \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.345859 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-catalog-content\") pod \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.345899 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd9nn\" (UniqueName: \"kubernetes.io/projected/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-kube-api-access-gd9nn\") pod \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\" (UID: \"201c4dfa-e452-4f0f-ab64-b2104e0b1c12\") " Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.347636 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-utilities" (OuterVolumeSpecName: "utilities") pod "201c4dfa-e452-4f0f-ab64-b2104e0b1c12" (UID: "201c4dfa-e452-4f0f-ab64-b2104e0b1c12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.348755 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-kube-api-access-gd9nn" (OuterVolumeSpecName: "kube-api-access-gd9nn") pod "201c4dfa-e452-4f0f-ab64-b2104e0b1c12" (UID: "201c4dfa-e452-4f0f-ab64-b2104e0b1c12"). InnerVolumeSpecName "kube-api-access-gd9nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.402515 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "201c4dfa-e452-4f0f-ab64-b2104e0b1c12" (UID: "201c4dfa-e452-4f0f-ab64-b2104e0b1c12"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.446547 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.446575 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.446585 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd9nn\" (UniqueName: \"kubernetes.io/projected/201c4dfa-e452-4f0f-ab64-b2104e0b1c12-kube-api-access-gd9nn\") on node \"crc\" DevicePath \"\"" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.455394 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.744085 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp"] Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.938799 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" event={"ID":"b460cb7c-dd22-42e4-91a1-1eee6a8340dc","Type":"ContainerStarted","Data":"62d6f3893b6936b6b83ab0c74537ccaef52140cfdc3b01f4e329e5331c3d2b28"} Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.939107 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" 
event={"ID":"b460cb7c-dd22-42e4-91a1-1eee6a8340dc","Type":"ContainerStarted","Data":"2110d21e27b61fe95764c7e2f85f389af1a4050cf3fbbdfb4fd1c67548485a8d"} Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.942035 4817 generic.go:334] "Generic (PLEG): container finished" podID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerID="e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3" exitCode=0 Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.942075 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc4kx" event={"ID":"201c4dfa-e452-4f0f-ab64-b2104e0b1c12","Type":"ContainerDied","Data":"e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3"} Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.942098 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vc4kx" event={"ID":"201c4dfa-e452-4f0f-ab64-b2104e0b1c12","Type":"ContainerDied","Data":"33ea8b7c82319dbaa0c3cecc05cbf2d0bf3c5ccc3c1157fc971589abf8ca8dfe"} Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.942117 4817 scope.go:117] "RemoveContainer" containerID="e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3" Feb 18 14:10:02 crc kubenswrapper[4817]: I0218 14:10:02.942117 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vc4kx" Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.012149 4817 scope.go:117] "RemoveContainer" containerID="7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040" Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.023676 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vc4kx"] Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.029382 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vc4kx"] Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.048496 4817 scope.go:117] "RemoveContainer" containerID="3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6" Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.075940 4817 scope.go:117] "RemoveContainer" containerID="e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3" Feb 18 14:10:03 crc kubenswrapper[4817]: E0218 14:10:03.076599 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3\": container with ID starting with e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3 not found: ID does not exist" containerID="e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3" Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.076681 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3"} err="failed to get container status \"e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3\": rpc error: code = NotFound desc = could not find container \"e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3\": container with ID starting with e87ca6a60c41b0875181f54fdbdc4496a44b607f77bc7b02edab10a939e746c3 not 
found: ID does not exist" Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.076726 4817 scope.go:117] "RemoveContainer" containerID="7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040" Feb 18 14:10:03 crc kubenswrapper[4817]: E0218 14:10:03.077409 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040\": container with ID starting with 7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040 not found: ID does not exist" containerID="7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040" Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.077471 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040"} err="failed to get container status \"7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040\": rpc error: code = NotFound desc = could not find container \"7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040\": container with ID starting with 7b9ef3cfb8b64964d157e2cff54136814905fc98cf2f72abaabfbb81fb7d9040 not found: ID does not exist" Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.077503 4817 scope.go:117] "RemoveContainer" containerID="3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6" Feb 18 14:10:03 crc kubenswrapper[4817]: E0218 14:10:03.078080 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6\": container with ID starting with 3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6 not found: ID does not exist" containerID="3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6" Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.078125 4817 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6"} err="failed to get container status \"3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6\": rpc error: code = NotFound desc = could not find container \"3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6\": container with ID starting with 3590c45c27c9045c8440c1f80c209a255e17bd0c6a54e14b027b9bbd92dd59f6 not found: ID does not exist" Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.951548 4817 generic.go:334] "Generic (PLEG): container finished" podID="b460cb7c-dd22-42e4-91a1-1eee6a8340dc" containerID="62d6f3893b6936b6b83ab0c74537ccaef52140cfdc3b01f4e329e5331c3d2b28" exitCode=0 Feb 18 14:10:03 crc kubenswrapper[4817]: I0218 14:10:03.951588 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" event={"ID":"b460cb7c-dd22-42e4-91a1-1eee6a8340dc","Type":"ContainerDied","Data":"62d6f3893b6936b6b83ab0c74537ccaef52140cfdc3b01f4e329e5331c3d2b28"} Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.177997 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" path="/var/lib/kubelet/pods/201c4dfa-e452-4f0f-ab64-b2104e0b1c12/volumes" Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.855437 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 18 14:10:04 crc kubenswrapper[4817]: E0218 14:10:04.855784 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerName="registry-server" Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.855815 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerName="registry-server" Feb 18 14:10:04 crc kubenswrapper[4817]: E0218 14:10:04.855833 4817 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerName="extract-utilities" Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.855845 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerName="extract-utilities" Feb 18 14:10:04 crc kubenswrapper[4817]: E0218 14:10:04.855871 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerName="extract-content" Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.855883 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerName="extract-content" Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.856070 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="201c4dfa-e452-4f0f-ab64-b2104e0b1c12" containerName="registry-server" Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.856692 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.859642 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.859718 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.860362 4817 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-rjpft" Feb 18 14:10:04 crc kubenswrapper[4817]: I0218 14:10:04.866940 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.002624 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7qrc\" (UniqueName: \"kubernetes.io/projected/127ba13b-a17e-49db-9d7f-4722d6a4cb28-kube-api-access-d7qrc\") pod \"minio\" (UID: \"127ba13b-a17e-49db-9d7f-4722d6a4cb28\") " pod="minio-dev/minio" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.002968 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6e0d80b6-cbf3-418f-817d-9f94360205e8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e0d80b6-cbf3-418f-817d-9f94360205e8\") pod \"minio\" (UID: \"127ba13b-a17e-49db-9d7f-4722d6a4cb28\") " pod="minio-dev/minio" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.104798 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6e0d80b6-cbf3-418f-817d-9f94360205e8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e0d80b6-cbf3-418f-817d-9f94360205e8\") pod \"minio\" (UID: \"127ba13b-a17e-49db-9d7f-4722d6a4cb28\") " pod="minio-dev/minio" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.104899 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-d7qrc\" (UniqueName: \"kubernetes.io/projected/127ba13b-a17e-49db-9d7f-4722d6a4cb28-kube-api-access-d7qrc\") pod \"minio\" (UID: \"127ba13b-a17e-49db-9d7f-4722d6a4cb28\") " pod="minio-dev/minio" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.108547 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.108612 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6e0d80b6-cbf3-418f-817d-9f94360205e8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e0d80b6-cbf3-418f-817d-9f94360205e8\") pod \"minio\" (UID: \"127ba13b-a17e-49db-9d7f-4722d6a4cb28\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/da10230d15268388b49958379fa1684c8aa981a012861c45e40c101839c4db5a/globalmount\"" pod="minio-dev/minio" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.131424 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7qrc\" (UniqueName: \"kubernetes.io/projected/127ba13b-a17e-49db-9d7f-4722d6a4cb28-kube-api-access-d7qrc\") pod \"minio\" (UID: \"127ba13b-a17e-49db-9d7f-4722d6a4cb28\") " pod="minio-dev/minio" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.137191 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6e0d80b6-cbf3-418f-817d-9f94360205e8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e0d80b6-cbf3-418f-817d-9f94360205e8\") pod \"minio\" (UID: \"127ba13b-a17e-49db-9d7f-4722d6a4cb28\") " pod="minio-dev/minio" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.174685 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.350519 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.491710 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hn6w5"] Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.492969 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.503143 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hn6w5"] Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.610406 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-catalog-content\") pod \"redhat-operators-hn6w5\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.610523 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5sfw\" (UniqueName: \"kubernetes.io/projected/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-kube-api-access-d5sfw\") pod \"redhat-operators-hn6w5\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.610589 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-utilities\") pod \"redhat-operators-hn6w5\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 
14:10:05.711675 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-utilities\") pod \"redhat-operators-hn6w5\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.711747 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-catalog-content\") pod \"redhat-operators-hn6w5\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.711780 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5sfw\" (UniqueName: \"kubernetes.io/projected/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-kube-api-access-d5sfw\") pod \"redhat-operators-hn6w5\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.712486 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-catalog-content\") pod \"redhat-operators-hn6w5\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.712563 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-utilities\") pod \"redhat-operators-hn6w5\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.731453 4817 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-d5sfw\" (UniqueName: \"kubernetes.io/projected/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-kube-api-access-d5sfw\") pod \"redhat-operators-hn6w5\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.829695 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.967276 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"127ba13b-a17e-49db-9d7f-4722d6a4cb28","Type":"ContainerStarted","Data":"7e14c76df201e095078f109b468684174b03d381b0db1ae86385208221c55a85"} Feb 18 14:10:05 crc kubenswrapper[4817]: I0218 14:10:05.971216 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" event={"ID":"b460cb7c-dd22-42e4-91a1-1eee6a8340dc","Type":"ContainerStarted","Data":"ca8f0c01cdc8bce6b7f9b884b43e36a0ac7d432a3011822a41ab6a00989e4916"} Feb 18 14:10:06 crc kubenswrapper[4817]: I0218 14:10:06.060830 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hn6w5"] Feb 18 14:10:06 crc kubenswrapper[4817]: W0218 14:10:06.092196 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20dbc44c_4fae_4a32_a7d1_8a02fc2cd9cd.slice/crio-37ba06a0d8fc3a52780881c69a42854087c080a821c159241a3c629c1e091271 WatchSource:0}: Error finding container 37ba06a0d8fc3a52780881c69a42854087c080a821c159241a3c629c1e091271: Status 404 returned error can't find the container with id 37ba06a0d8fc3a52780881c69a42854087c080a821c159241a3c629c1e091271 Feb 18 14:10:06 crc kubenswrapper[4817]: I0218 14:10:06.981176 4817 generic.go:334] "Generic (PLEG): container finished" podID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" 
containerID="c91f5b4da90dfed8975ec4e888c0e0a6f79d54d7325bb400670c40c4b26d3d53" exitCode=0 Feb 18 14:10:06 crc kubenswrapper[4817]: I0218 14:10:06.981254 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hn6w5" event={"ID":"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd","Type":"ContainerDied","Data":"c91f5b4da90dfed8975ec4e888c0e0a6f79d54d7325bb400670c40c4b26d3d53"} Feb 18 14:10:06 crc kubenswrapper[4817]: I0218 14:10:06.981586 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hn6w5" event={"ID":"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd","Type":"ContainerStarted","Data":"37ba06a0d8fc3a52780881c69a42854087c080a821c159241a3c629c1e091271"} Feb 18 14:10:06 crc kubenswrapper[4817]: I0218 14:10:06.985378 4817 generic.go:334] "Generic (PLEG): container finished" podID="b460cb7c-dd22-42e4-91a1-1eee6a8340dc" containerID="ca8f0c01cdc8bce6b7f9b884b43e36a0ac7d432a3011822a41ab6a00989e4916" exitCode=0 Feb 18 14:10:06 crc kubenswrapper[4817]: I0218 14:10:06.985442 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" event={"ID":"b460cb7c-dd22-42e4-91a1-1eee6a8340dc","Type":"ContainerDied","Data":"ca8f0c01cdc8bce6b7f9b884b43e36a0ac7d432a3011822a41ab6a00989e4916"} Feb 18 14:10:09 crc kubenswrapper[4817]: I0218 14:10:09.004068 4817 generic.go:334] "Generic (PLEG): container finished" podID="b460cb7c-dd22-42e4-91a1-1eee6a8340dc" containerID="15db5186bc79f8cc67aa27307015be67095a3352dda7cfd365e79b08e2257ab9" exitCode=0 Feb 18 14:10:09 crc kubenswrapper[4817]: I0218 14:10:09.004183 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" event={"ID":"b460cb7c-dd22-42e4-91a1-1eee6a8340dc","Type":"ContainerDied","Data":"15db5186bc79f8cc67aa27307015be67095a3352dda7cfd365e79b08e2257ab9"} Feb 18 14:10:10 crc kubenswrapper[4817]: 
I0218 14:10:10.014910 4817 generic.go:334] "Generic (PLEG): container finished" podID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" containerID="e58f3de69ac37102e81c1448f164365e6c288791356395e4c5ecc45404f7af0e" exitCode=0 Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.015001 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hn6w5" event={"ID":"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd","Type":"ContainerDied","Data":"e58f3de69ac37102e81c1448f164365e6c288791356395e4c5ecc45404f7af0e"} Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.020110 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"127ba13b-a17e-49db-9d7f-4722d6a4cb28","Type":"ContainerStarted","Data":"21dfe13412e888e4023823eb0b9c53fb759e7acbca052d78a5b67a262d55c1e2"} Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.281839 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.299093 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=5.7788046170000005 podStartE2EDuration="9.29907492s" podCreationTimestamp="2026-02-18 14:10:01 +0000 UTC" firstStartedPulling="2026-02-18 14:10:05.358528568 +0000 UTC m=+667.934064561" lastFinishedPulling="2026-02-18 14:10:08.878798881 +0000 UTC m=+671.454334864" observedRunningTime="2026-02-18 14:10:10.071498049 +0000 UTC m=+672.647034032" watchObservedRunningTime="2026-02-18 14:10:10.29907492 +0000 UTC m=+672.874610913" Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.385658 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g4bp\" (UniqueName: \"kubernetes.io/projected/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-kube-api-access-2g4bp\") pod \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\" (UID: 
\"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.385764 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-util\") pod \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.385829 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-bundle\") pod \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\" (UID: \"b460cb7c-dd22-42e4-91a1-1eee6a8340dc\") " Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.386607 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-bundle" (OuterVolumeSpecName: "bundle") pod "b460cb7c-dd22-42e4-91a1-1eee6a8340dc" (UID: "b460cb7c-dd22-42e4-91a1-1eee6a8340dc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.392102 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-kube-api-access-2g4bp" (OuterVolumeSpecName: "kube-api-access-2g4bp") pod "b460cb7c-dd22-42e4-91a1-1eee6a8340dc" (UID: "b460cb7c-dd22-42e4-91a1-1eee6a8340dc"). InnerVolumeSpecName "kube-api-access-2g4bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.395161 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-util" (OuterVolumeSpecName: "util") pod "b460cb7c-dd22-42e4-91a1-1eee6a8340dc" (UID: "b460cb7c-dd22-42e4-91a1-1eee6a8340dc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.487953 4817 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-util\") on node \"crc\" DevicePath \"\"" Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.488019 4817 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:10:10 crc kubenswrapper[4817]: I0218 14:10:10.488038 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g4bp\" (UniqueName: \"kubernetes.io/projected/b460cb7c-dd22-42e4-91a1-1eee6a8340dc-kube-api-access-2g4bp\") on node \"crc\" DevicePath \"\"" Feb 18 14:10:11 crc kubenswrapper[4817]: I0218 14:10:11.028244 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" Feb 18 14:10:11 crc kubenswrapper[4817]: I0218 14:10:11.028290 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp" event={"ID":"b460cb7c-dd22-42e4-91a1-1eee6a8340dc","Type":"ContainerDied","Data":"2110d21e27b61fe95764c7e2f85f389af1a4050cf3fbbdfb4fd1c67548485a8d"} Feb 18 14:10:11 crc kubenswrapper[4817]: I0218 14:10:11.028430 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2110d21e27b61fe95764c7e2f85f389af1a4050cf3fbbdfb4fd1c67548485a8d" Feb 18 14:10:11 crc kubenswrapper[4817]: I0218 14:10:11.030428 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hn6w5" event={"ID":"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd","Type":"ContainerStarted","Data":"c451f2a6ac96d03e4e06988a89952a7ef4c4308d8c2c0d78913a35d8c066b3a5"} Feb 18 14:10:11 crc 
kubenswrapper[4817]: I0218 14:10:11.055017 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hn6w5" podStartSLOduration=2.879587612 podStartE2EDuration="6.054994082s" podCreationTimestamp="2026-02-18 14:10:05 +0000 UTC" firstStartedPulling="2026-02-18 14:10:07.290959538 +0000 UTC m=+669.866495521" lastFinishedPulling="2026-02-18 14:10:10.466365978 +0000 UTC m=+673.041901991" observedRunningTime="2026-02-18 14:10:11.050008718 +0000 UTC m=+673.625544721" watchObservedRunningTime="2026-02-18 14:10:11.054994082 +0000 UTC m=+673.630530075" Feb 18 14:10:15 crc kubenswrapper[4817]: I0218 14:10:15.830012 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:15 crc kubenswrapper[4817]: I0218 14:10:15.830468 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.545148 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw"] Feb 18 14:10:16 crc kubenswrapper[4817]: E0218 14:10:16.545351 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b460cb7c-dd22-42e4-91a1-1eee6a8340dc" containerName="extract" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.545363 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="b460cb7c-dd22-42e4-91a1-1eee6a8340dc" containerName="extract" Feb 18 14:10:16 crc kubenswrapper[4817]: E0218 14:10:16.545374 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b460cb7c-dd22-42e4-91a1-1eee6a8340dc" containerName="pull" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.545381 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="b460cb7c-dd22-42e4-91a1-1eee6a8340dc" containerName="pull" Feb 18 14:10:16 crc kubenswrapper[4817]: E0218 14:10:16.545391 
4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b460cb7c-dd22-42e4-91a1-1eee6a8340dc" containerName="util" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.545396 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="b460cb7c-dd22-42e4-91a1-1eee6a8340dc" containerName="util" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.545500 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="b460cb7c-dd22-42e4-91a1-1eee6a8340dc" containerName="extract" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.546030 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.549004 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-vb85t" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.549039 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.549043 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.549279 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.549316 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.549337 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.570720 4817 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw"] Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.669546 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bf27f33f-390f-44fa-91fb-40f18240d0df-manager-config\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.669627 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf27f33f-390f-44fa-91fb-40f18240d0df-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.669699 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6t7z\" (UniqueName: \"kubernetes.io/projected/bf27f33f-390f-44fa-91fb-40f18240d0df-kube-api-access-m6t7z\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.669751 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf27f33f-390f-44fa-91fb-40f18240d0df-webhook-cert\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 
14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.669776 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf27f33f-390f-44fa-91fb-40f18240d0df-apiservice-cert\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.771727 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6t7z\" (UniqueName: \"kubernetes.io/projected/bf27f33f-390f-44fa-91fb-40f18240d0df-kube-api-access-m6t7z\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.771811 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf27f33f-390f-44fa-91fb-40f18240d0df-webhook-cert\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.771836 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf27f33f-390f-44fa-91fb-40f18240d0df-apiservice-cert\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.771886 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/bf27f33f-390f-44fa-91fb-40f18240d0df-manager-config\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.771948 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf27f33f-390f-44fa-91fb-40f18240d0df-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.772905 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bf27f33f-390f-44fa-91fb-40f18240d0df-manager-config\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.780760 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf27f33f-390f-44fa-91fb-40f18240d0df-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.790807 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf27f33f-390f-44fa-91fb-40f18240d0df-webhook-cert\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.794186 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf27f33f-390f-44fa-91fb-40f18240d0df-apiservice-cert\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.813658 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6t7z\" (UniqueName: \"kubernetes.io/projected/bf27f33f-390f-44fa-91fb-40f18240d0df-kube-api-access-m6t7z\") pod \"loki-operator-controller-manager-59d4b4c7c-rvnbw\" (UID: \"bf27f33f-390f-44fa-91fb-40f18240d0df\") " pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.863124 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:16 crc kubenswrapper[4817]: I0218 14:10:16.875822 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hn6w5" podUID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" containerName="registry-server" probeResult="failure" output=< Feb 18 14:10:16 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 14:10:16 crc kubenswrapper[4817]: > Feb 18 14:10:17 crc kubenswrapper[4817]: I0218 14:10:17.190467 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw"] Feb 18 14:10:17 crc kubenswrapper[4817]: W0218 14:10:17.198862 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf27f33f_390f_44fa_91fb_40f18240d0df.slice/crio-0027b51b2417f59d3d22d81514161a6ce3c18d08f9aa075cc39d51b35ac31cb8 WatchSource:0}: Error finding container 0027b51b2417f59d3d22d81514161a6ce3c18d08f9aa075cc39d51b35ac31cb8: Status 404 returned error can't find the container with id 0027b51b2417f59d3d22d81514161a6ce3c18d08f9aa075cc39d51b35ac31cb8 Feb 18 14:10:18 crc kubenswrapper[4817]: I0218 14:10:18.087220 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" event={"ID":"bf27f33f-390f-44fa-91fb-40f18240d0df","Type":"ContainerStarted","Data":"0027b51b2417f59d3d22d81514161a6ce3c18d08f9aa075cc39d51b35ac31cb8"} Feb 18 14:10:23 crc kubenswrapper[4817]: I0218 14:10:23.118928 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" event={"ID":"bf27f33f-390f-44fa-91fb-40f18240d0df","Type":"ContainerStarted","Data":"00898e4044591aa662aaf714d157194f4eaa2f48ba1381f3fcda3e0718d93370"} Feb 18 14:10:25 crc 
kubenswrapper[4817]: I0218 14:10:25.893632 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:25 crc kubenswrapper[4817]: I0218 14:10:25.937128 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:26 crc kubenswrapper[4817]: I0218 14:10:26.696528 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hn6w5"] Feb 18 14:10:27 crc kubenswrapper[4817]: I0218 14:10:27.141617 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hn6w5" podUID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" containerName="registry-server" containerID="cri-o://c451f2a6ac96d03e4e06988a89952a7ef4c4308d8c2c0d78913a35d8c066b3a5" gracePeriod=2 Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.149933 4817 generic.go:334] "Generic (PLEG): container finished" podID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" containerID="c451f2a6ac96d03e4e06988a89952a7ef4c4308d8c2c0d78913a35d8c066b3a5" exitCode=0 Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.150163 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hn6w5" event={"ID":"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd","Type":"ContainerDied","Data":"c451f2a6ac96d03e4e06988a89952a7ef4c4308d8c2c0d78913a35d8c066b3a5"} Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.533757 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.646139 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-catalog-content\") pod \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.646189 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-utilities\") pod \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.646220 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5sfw\" (UniqueName: \"kubernetes.io/projected/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-kube-api-access-d5sfw\") pod \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\" (UID: \"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd\") " Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.647052 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-utilities" (OuterVolumeSpecName: "utilities") pod "20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" (UID: "20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.652081 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-kube-api-access-d5sfw" (OuterVolumeSpecName: "kube-api-access-d5sfw") pod "20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" (UID: "20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd"). InnerVolumeSpecName "kube-api-access-d5sfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.748102 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.748426 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5sfw\" (UniqueName: \"kubernetes.io/projected/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-kube-api-access-d5sfw\") on node \"crc\" DevicePath \"\"" Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.773845 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" (UID: "20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:10:28 crc kubenswrapper[4817]: I0218 14:10:28.849789 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:10:29 crc kubenswrapper[4817]: I0218 14:10:29.158358 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" event={"ID":"bf27f33f-390f-44fa-91fb-40f18240d0df","Type":"ContainerStarted","Data":"38862de19d8f6529b968da8ea55c436b5e064c6a6e7279dcdf11c6d1988f73aa"} Feb 18 14:10:29 crc kubenswrapper[4817]: I0218 14:10:29.158607 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:29 crc kubenswrapper[4817]: I0218 14:10:29.160375 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" Feb 18 14:10:29 crc kubenswrapper[4817]: I0218 14:10:29.161332 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hn6w5" event={"ID":"20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd","Type":"ContainerDied","Data":"37ba06a0d8fc3a52780881c69a42854087c080a821c159241a3c629c1e091271"} Feb 18 14:10:29 crc kubenswrapper[4817]: I0218 14:10:29.161393 4817 scope.go:117] "RemoveContainer" containerID="c451f2a6ac96d03e4e06988a89952a7ef4c4308d8c2c0d78913a35d8c066b3a5" Feb 18 14:10:29 crc kubenswrapper[4817]: I0218 14:10:29.161402 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hn6w5" Feb 18 14:10:29 crc kubenswrapper[4817]: I0218 14:10:29.180074 4817 scope.go:117] "RemoveContainer" containerID="e58f3de69ac37102e81c1448f164365e6c288791356395e4c5ecc45404f7af0e" Feb 18 14:10:29 crc kubenswrapper[4817]: I0218 14:10:29.195375 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-59d4b4c7c-rvnbw" podStartSLOduration=1.7960971890000001 podStartE2EDuration="13.19535568s" podCreationTimestamp="2026-02-18 14:10:16 +0000 UTC" firstStartedPulling="2026-02-18 14:10:17.201302933 +0000 UTC m=+679.776838916" lastFinishedPulling="2026-02-18 14:10:28.600561424 +0000 UTC m=+691.176097407" observedRunningTime="2026-02-18 14:10:29.190847978 +0000 UTC m=+691.766383981" watchObservedRunningTime="2026-02-18 14:10:29.19535568 +0000 UTC m=+691.770891663" Feb 18 14:10:29 crc kubenswrapper[4817]: I0218 14:10:29.202020 4817 scope.go:117] "RemoveContainer" containerID="c91f5b4da90dfed8975ec4e888c0e0a6f79d54d7325bb400670c40c4b26d3d53" Feb 18 14:10:29 crc kubenswrapper[4817]: I0218 14:10:29.227041 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hn6w5"] Feb 18 14:10:29 
crc kubenswrapper[4817]: I0218 14:10:29.231224 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hn6w5"] Feb 18 14:10:30 crc kubenswrapper[4817]: I0218 14:10:30.181471 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" path="/var/lib/kubelet/pods/20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd/volumes" Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.405143 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"] Feb 18 14:10:48 crc kubenswrapper[4817]: E0218 14:10:48.405827 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" containerName="extract-utilities" Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.405841 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" containerName="extract-utilities" Feb 18 14:10:48 crc kubenswrapper[4817]: E0218 14:10:48.405858 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" containerName="extract-content" Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.405869 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" containerName="extract-content" Feb 18 14:10:48 crc kubenswrapper[4817]: E0218 14:10:48.405882 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" containerName="registry-server" Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.405892 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" containerName="registry-server" Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.406036 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="20dbc44c-4fae-4a32-a7d1-8a02fc2cd9cd" 
containerName="registry-server"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.406937 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.409073 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.420964 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"]
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.505022 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.505054 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzm7p\" (UniqueName: \"kubernetes.io/projected/ca2187cb-8ba5-4146-a506-4989f6bade5c-kube-api-access-lzm7p\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.505147 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.606375 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.606415 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzm7p\" (UniqueName: \"kubernetes.io/projected/ca2187cb-8ba5-4146-a506-4989f6bade5c-kube-api-access-lzm7p\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.606470 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.606947 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.606952 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.637270 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzm7p\" (UniqueName: \"kubernetes.io/projected/ca2187cb-8ba5-4146-a506-4989f6bade5c-kube-api-access-lzm7p\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:48 crc kubenswrapper[4817]: I0218 14:10:48.727355 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:49 crc kubenswrapper[4817]: I0218 14:10:49.146737 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"]
Feb 18 14:10:49 crc kubenswrapper[4817]: I0218 14:10:49.290417 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62" event={"ID":"ca2187cb-8ba5-4146-a506-4989f6bade5c","Type":"ContainerStarted","Data":"eab6cc0ce01b9329bf4d6f12252ea8b5efccb5ab8ae775b12dbf0b09bc40ca86"}
Feb 18 14:10:50 crc kubenswrapper[4817]: I0218 14:10:50.300710 4817 generic.go:334] "Generic (PLEG): container finished" podID="ca2187cb-8ba5-4146-a506-4989f6bade5c" containerID="22896ef7fe04d3b852f149faabcc3714fa621ebde191fceae5e935eac4e87973" exitCode=0
Feb 18 14:10:50 crc kubenswrapper[4817]: I0218 14:10:50.300748 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62" event={"ID":"ca2187cb-8ba5-4146-a506-4989f6bade5c","Type":"ContainerDied","Data":"22896ef7fe04d3b852f149faabcc3714fa621ebde191fceae5e935eac4e87973"}
Feb 18 14:10:52 crc kubenswrapper[4817]: I0218 14:10:52.318111 4817 generic.go:334] "Generic (PLEG): container finished" podID="ca2187cb-8ba5-4146-a506-4989f6bade5c" containerID="609d62e811bb2fe828edc244db507c6cfbe165d566a1fc8ef867ce90e23053dd" exitCode=0
Feb 18 14:10:52 crc kubenswrapper[4817]: I0218 14:10:52.318228 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62" event={"ID":"ca2187cb-8ba5-4146-a506-4989f6bade5c","Type":"ContainerDied","Data":"609d62e811bb2fe828edc244db507c6cfbe165d566a1fc8ef867ce90e23053dd"}
Feb 18 14:10:53 crc kubenswrapper[4817]: I0218 14:10:53.330786 4817 generic.go:334] "Generic (PLEG): container finished" podID="ca2187cb-8ba5-4146-a506-4989f6bade5c" containerID="69bb81185a48d4a70c2e154d97eea133d4841293e4fbc8f641990a680eb285e9" exitCode=0
Feb 18 14:10:53 crc kubenswrapper[4817]: I0218 14:10:53.330867 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62" event={"ID":"ca2187cb-8ba5-4146-a506-4989f6bade5c","Type":"ContainerDied","Data":"69bb81185a48d4a70c2e154d97eea133d4841293e4fbc8f641990a680eb285e9"}
Feb 18 14:10:54 crc kubenswrapper[4817]: I0218 14:10:54.568811 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:10:54 crc kubenswrapper[4817]: I0218 14:10:54.693773 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-bundle\") pod \"ca2187cb-8ba5-4146-a506-4989f6bade5c\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") "
Feb 18 14:10:54 crc kubenswrapper[4817]: I0218 14:10:54.693832 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-util\") pod \"ca2187cb-8ba5-4146-a506-4989f6bade5c\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") "
Feb 18 14:10:54 crc kubenswrapper[4817]: I0218 14:10:54.693853 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzm7p\" (UniqueName: \"kubernetes.io/projected/ca2187cb-8ba5-4146-a506-4989f6bade5c-kube-api-access-lzm7p\") pod \"ca2187cb-8ba5-4146-a506-4989f6bade5c\" (UID: \"ca2187cb-8ba5-4146-a506-4989f6bade5c\") "
Feb 18 14:10:54 crc kubenswrapper[4817]: I0218 14:10:54.694542 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-bundle" (OuterVolumeSpecName: "bundle") pod "ca2187cb-8ba5-4146-a506-4989f6bade5c" (UID: "ca2187cb-8ba5-4146-a506-4989f6bade5c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:10:54 crc kubenswrapper[4817]: I0218 14:10:54.702106 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2187cb-8ba5-4146-a506-4989f6bade5c-kube-api-access-lzm7p" (OuterVolumeSpecName: "kube-api-access-lzm7p") pod "ca2187cb-8ba5-4146-a506-4989f6bade5c" (UID: "ca2187cb-8ba5-4146-a506-4989f6bade5c"). InnerVolumeSpecName "kube-api-access-lzm7p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:10:54 crc kubenswrapper[4817]: I0218 14:10:54.795647 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzm7p\" (UniqueName: \"kubernetes.io/projected/ca2187cb-8ba5-4146-a506-4989f6bade5c-kube-api-access-lzm7p\") on node \"crc\" DevicePath \"\""
Feb 18 14:10:54 crc kubenswrapper[4817]: I0218 14:10:54.795701 4817 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:10:54 crc kubenswrapper[4817]: I0218 14:10:54.952124 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-util" (OuterVolumeSpecName: "util") pod "ca2187cb-8ba5-4146-a506-4989f6bade5c" (UID: "ca2187cb-8ba5-4146-a506-4989f6bade5c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:10:54 crc kubenswrapper[4817]: I0218 14:10:54.999435 4817 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ca2187cb-8ba5-4146-a506-4989f6bade5c-util\") on node \"crc\" DevicePath \"\""
Feb 18 14:10:55 crc kubenswrapper[4817]: I0218 14:10:55.348534 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62" event={"ID":"ca2187cb-8ba5-4146-a506-4989f6bade5c","Type":"ContainerDied","Data":"eab6cc0ce01b9329bf4d6f12252ea8b5efccb5ab8ae775b12dbf0b09bc40ca86"}
Feb 18 14:10:55 crc kubenswrapper[4817]: I0218 14:10:55.348580 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eab6cc0ce01b9329bf4d6f12252ea8b5efccb5ab8ae775b12dbf0b09bc40ca86"
Feb 18 14:10:55 crc kubenswrapper[4817]: I0218 14:10:55.348592 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.501673 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-4jkpg"]
Feb 18 14:11:00 crc kubenswrapper[4817]: E0218 14:11:00.502535 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2187cb-8ba5-4146-a506-4989f6bade5c" containerName="extract"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.502551 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2187cb-8ba5-4146-a506-4989f6bade5c" containerName="extract"
Feb 18 14:11:00 crc kubenswrapper[4817]: E0218 14:11:00.502574 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2187cb-8ba5-4146-a506-4989f6bade5c" containerName="pull"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.502583 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2187cb-8ba5-4146-a506-4989f6bade5c" containerName="pull"
Feb 18 14:11:00 crc kubenswrapper[4817]: E0218 14:11:00.502596 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2187cb-8ba5-4146-a506-4989f6bade5c" containerName="util"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.502604 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2187cb-8ba5-4146-a506-4989f6bade5c" containerName="util"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.502721 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2187cb-8ba5-4146-a506-4989f6bade5c" containerName="extract"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.503255 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-4jkpg"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.506739 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.506753 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-slss7"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.508014 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.515954 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-4jkpg"]
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.567163 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62jz7\" (UniqueName: \"kubernetes.io/projected/ca7ab19f-157f-4626-80d3-27ed1a469d95-kube-api-access-62jz7\") pod \"nmstate-operator-694c9596b7-4jkpg\" (UID: \"ca7ab19f-157f-4626-80d3-27ed1a469d95\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-4jkpg"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.668859 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62jz7\" (UniqueName: \"kubernetes.io/projected/ca7ab19f-157f-4626-80d3-27ed1a469d95-kube-api-access-62jz7\") pod \"nmstate-operator-694c9596b7-4jkpg\" (UID: \"ca7ab19f-157f-4626-80d3-27ed1a469d95\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-4jkpg"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.694093 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62jz7\" (UniqueName: \"kubernetes.io/projected/ca7ab19f-157f-4626-80d3-27ed1a469d95-kube-api-access-62jz7\") pod \"nmstate-operator-694c9596b7-4jkpg\" (UID: \"ca7ab19f-157f-4626-80d3-27ed1a469d95\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-4jkpg"
Feb 18 14:11:00 crc kubenswrapper[4817]: I0218 14:11:00.820004 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-4jkpg"
Feb 18 14:11:01 crc kubenswrapper[4817]: I0218 14:11:01.066736 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-4jkpg"]
Feb 18 14:11:01 crc kubenswrapper[4817]: W0218 14:11:01.080461 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca7ab19f_157f_4626_80d3_27ed1a469d95.slice/crio-8b8365c22d56922c5406d1834af3a619b24246b798169bce7a41c6785fa1ed5e WatchSource:0}: Error finding container 8b8365c22d56922c5406d1834af3a619b24246b798169bce7a41c6785fa1ed5e: Status 404 returned error can't find the container with id 8b8365c22d56922c5406d1834af3a619b24246b798169bce7a41c6785fa1ed5e
Feb 18 14:11:01 crc kubenswrapper[4817]: I0218 14:11:01.394409 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-4jkpg" event={"ID":"ca7ab19f-157f-4626-80d3-27ed1a469d95","Type":"ContainerStarted","Data":"8b8365c22d56922c5406d1834af3a619b24246b798169bce7a41c6785fa1ed5e"}
Feb 18 14:11:04 crc kubenswrapper[4817]: I0218 14:11:04.411159 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-4jkpg" event={"ID":"ca7ab19f-157f-4626-80d3-27ed1a469d95","Type":"ContainerStarted","Data":"82e1d6c2c5fce6caa1f08f16a06aa7ceb0eb7ecf79923e070779587bf6778a65"}
Feb 18 14:11:04 crc kubenswrapper[4817]: I0218 14:11:04.427953 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-4jkpg" podStartSLOduration=2.170404695 podStartE2EDuration="4.427921446s" podCreationTimestamp="2026-02-18 14:11:00 +0000 UTC" firstStartedPulling="2026-02-18 14:11:01.087196626 +0000 UTC m=+723.662732609" lastFinishedPulling="2026-02-18 14:11:03.344713377 +0000 UTC m=+725.920249360" observedRunningTime="2026-02-18 14:11:04.426514992 +0000 UTC m=+727.002050995" watchObservedRunningTime="2026-02-18 14:11:04.427921446 +0000 UTC m=+727.003457449"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.437063 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-qnftd"]
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.438417 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qnftd"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.441573 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rbzvm"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.452032 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq"]
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.452964 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.455597 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.460015 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-qnftd"]
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.487849 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq"]
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.518047 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-l4ntc"]
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.519072 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.590020 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhh4s\" (UniqueName: \"kubernetes.io/projected/1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a-kube-api-access-lhh4s\") pod \"nmstate-webhook-866bcb46dc-b2vsq\" (UID: \"1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.590079 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8b4f277d-2b45-43de-b3f7-52e968407f19-ovs-socket\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.590155 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8b4f277d-2b45-43de-b3f7-52e968407f19-dbus-socket\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.590303 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8b4f277d-2b45-43de-b3f7-52e968407f19-nmstate-lock\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.590390 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-b2vsq\" (UID: \"1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.590430 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txjv6\" (UniqueName: \"kubernetes.io/projected/65c92dc4-036d-42e0-baa1-3dc7e23c43b3-kube-api-access-txjv6\") pod \"nmstate-metrics-58c85c668d-qnftd\" (UID: \"65c92dc4-036d-42e0-baa1-3dc7e23c43b3\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-qnftd"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.590457 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9kc\" (UniqueName: \"kubernetes.io/projected/8b4f277d-2b45-43de-b3f7-52e968407f19-kube-api-access-7f9kc\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.613803 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"]
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.614496 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.617251 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-tdcxx"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.617714 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.617951 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.624329 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"]
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.692212 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8b4f277d-2b45-43de-b3f7-52e968407f19-nmstate-lock\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.692280 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-b2vsq\" (UID: \"1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.692313 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txjv6\" (UniqueName: \"kubernetes.io/projected/65c92dc4-036d-42e0-baa1-3dc7e23c43b3-kube-api-access-txjv6\") pod \"nmstate-metrics-58c85c668d-qnftd\" (UID: \"65c92dc4-036d-42e0-baa1-3dc7e23c43b3\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-qnftd"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.692338 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9kc\" (UniqueName: \"kubernetes.io/projected/8b4f277d-2b45-43de-b3f7-52e968407f19-kube-api-access-7f9kc\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.692339 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8b4f277d-2b45-43de-b3f7-52e968407f19-nmstate-lock\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: E0218 14:11:09.692452 4817 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Feb 18 14:11:09 crc kubenswrapper[4817]: E0218 14:11:09.692826 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a-tls-key-pair podName:1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a nodeName:}" failed. No retries permitted until 2026-02-18 14:11:10.192803372 +0000 UTC m=+732.768339355 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a-tls-key-pair") pod "nmstate-webhook-866bcb46dc-b2vsq" (UID: "1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a") : secret "openshift-nmstate-webhook" not found
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.692713 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhh4s\" (UniqueName: \"kubernetes.io/projected/1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a-kube-api-access-lhh4s\") pod \"nmstate-webhook-866bcb46dc-b2vsq\" (UID: \"1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.692938 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8b4f277d-2b45-43de-b3f7-52e968407f19-ovs-socket\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.693014 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8b4f277d-2b45-43de-b3f7-52e968407f19-ovs-socket\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.693031 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/abf003b3-f87b-4907-ad15-59b8f12108b3-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-8qsxw\" (UID: \"abf003b3-f87b-4907-ad15-59b8f12108b3\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.693076 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25x5d\" (UniqueName: \"kubernetes.io/projected/abf003b3-f87b-4907-ad15-59b8f12108b3-kube-api-access-25x5d\") pod \"nmstate-console-plugin-5c78fc5d65-8qsxw\" (UID: \"abf003b3-f87b-4907-ad15-59b8f12108b3\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.693107 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8b4f277d-2b45-43de-b3f7-52e968407f19-dbus-socket\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.693149 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/abf003b3-f87b-4907-ad15-59b8f12108b3-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-8qsxw\" (UID: \"abf003b3-f87b-4907-ad15-59b8f12108b3\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.693450 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8b4f277d-2b45-43de-b3f7-52e968407f19-dbus-socket\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.711604 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhh4s\" (UniqueName: \"kubernetes.io/projected/1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a-kube-api-access-lhh4s\") pod \"nmstate-webhook-866bcb46dc-b2vsq\" (UID: \"1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.717746 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9kc\" (UniqueName: \"kubernetes.io/projected/8b4f277d-2b45-43de-b3f7-52e968407f19-kube-api-access-7f9kc\") pod \"nmstate-handler-l4ntc\" (UID: \"8b4f277d-2b45-43de-b3f7-52e968407f19\") " pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.720580 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txjv6\" (UniqueName: \"kubernetes.io/projected/65c92dc4-036d-42e0-baa1-3dc7e23c43b3-kube-api-access-txjv6\") pod \"nmstate-metrics-58c85c668d-qnftd\" (UID: \"65c92dc4-036d-42e0-baa1-3dc7e23c43b3\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-qnftd"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.759611 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qnftd"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.803534 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/abf003b3-f87b-4907-ad15-59b8f12108b3-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-8qsxw\" (UID: \"abf003b3-f87b-4907-ad15-59b8f12108b3\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.803797 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25x5d\" (UniqueName: \"kubernetes.io/projected/abf003b3-f87b-4907-ad15-59b8f12108b3-kube-api-access-25x5d\") pod \"nmstate-console-plugin-5c78fc5d65-8qsxw\" (UID: \"abf003b3-f87b-4907-ad15-59b8f12108b3\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.803909 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/abf003b3-f87b-4907-ad15-59b8f12108b3-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-8qsxw\" (UID: \"abf003b3-f87b-4907-ad15-59b8f12108b3\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.805296 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/abf003b3-f87b-4907-ad15-59b8f12108b3-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-8qsxw\" (UID: \"abf003b3-f87b-4907-ad15-59b8f12108b3\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.813160 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/abf003b3-f87b-4907-ad15-59b8f12108b3-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-8qsxw\" (UID: \"abf003b3-f87b-4907-ad15-59b8f12108b3\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.831939 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25x5d\" (UniqueName: \"kubernetes.io/projected/abf003b3-f87b-4907-ad15-59b8f12108b3-kube-api-access-25x5d\") pod \"nmstate-console-plugin-5c78fc5d65-8qsxw\" (UID: \"abf003b3-f87b-4907-ad15-59b8f12108b3\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.845027 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-l4ntc"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.932373 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.967080 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-d74b67b44-lgldb"]
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.968248 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:09 crc kubenswrapper[4817]: I0218 14:11:09.975712 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d74b67b44-lgldb"]
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.109603 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-console-serving-cert\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.109650 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndwwc\" (UniqueName: \"kubernetes.io/projected/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-kube-api-access-ndwwc\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.109907 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-trusted-ca-bundle\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.110006 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-console-oauth-config\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.110040 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-console-config\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.110063 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-oauth-serving-cert\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.110096 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-service-ca\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.212097 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-console-serving-cert\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.212155 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndwwc\" (UniqueName: \"kubernetes.io/projected/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-kube-api-access-ndwwc\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.212190 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-b2vsq\" (UID: \"1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.212227 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-trusted-ca-bundle\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.212250 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-console-oauth-config\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.212269 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-console-config\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.212289 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-oauth-serving-cert\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.212315 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-service-ca\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.213930 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-service-ca\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.214199 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-console-config\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.214507 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-oauth-serving-cert\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb"
Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.216010 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-trusted-ca-bundle\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb" Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.222066 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-console-oauth-config\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb" Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.222656 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-b2vsq\" (UID: \"1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq" Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.226575 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-console-serving-cert\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb" Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.232349 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndwwc\" (UniqueName: \"kubernetes.io/projected/ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23-kube-api-access-ndwwc\") pod \"console-d74b67b44-lgldb\" (UID: \"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23\") " pod="openshift-console/console-d74b67b44-lgldb" Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.306531 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d74b67b44-lgldb" Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.351498 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-qnftd"] Feb 18 14:11:10 crc kubenswrapper[4817]: W0218 14:11:10.361346 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65c92dc4_036d_42e0_baa1_3dc7e23c43b3.slice/crio-8f5bd6f83618473f1e5f7b908713a93d0fc4cb1e48e7537527250013663cc66d WatchSource:0}: Error finding container 8f5bd6f83618473f1e5f7b908713a93d0fc4cb1e48e7537527250013663cc66d: Status 404 returned error can't find the container with id 8f5bd6f83618473f1e5f7b908713a93d0fc4cb1e48e7537527250013663cc66d Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.397522 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq" Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.421677 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw"] Feb 18 14:11:10 crc kubenswrapper[4817]: W0218 14:11:10.436817 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabf003b3_f87b_4907_ad15_59b8f12108b3.slice/crio-b76ed14823688a48085d8336249188e78a25b3154ca44902334df1524d0d9969 WatchSource:0}: Error finding container b76ed14823688a48085d8336249188e78a25b3154ca44902334df1524d0d9969: Status 404 returned error can't find the container with id b76ed14823688a48085d8336249188e78a25b3154ca44902334df1524d0d9969 Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.456927 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw" 
event={"ID":"abf003b3-f87b-4907-ad15-59b8f12108b3","Type":"ContainerStarted","Data":"b76ed14823688a48085d8336249188e78a25b3154ca44902334df1524d0d9969"} Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.459534 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qnftd" event={"ID":"65c92dc4-036d-42e0-baa1-3dc7e23c43b3","Type":"ContainerStarted","Data":"8f5bd6f83618473f1e5f7b908713a93d0fc4cb1e48e7537527250013663cc66d"} Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.460617 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-l4ntc" event={"ID":"8b4f277d-2b45-43de-b3f7-52e968407f19","Type":"ContainerStarted","Data":"6e3f8354b1cc8a34cc8660a8f6bcb04dc7ff3ebd8a261dee6057ffb3acae592d"} Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.625524 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq"] Feb 18 14:11:10 crc kubenswrapper[4817]: W0218 14:11:10.629244 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cc1bcee_c9a0_4bda_9fb1_0f178d5a938a.slice/crio-a8496bff2c75f0685c9129b7e4a8283b6e9de229167775bbceefd7f1e756985c WatchSource:0}: Error finding container a8496bff2c75f0685c9129b7e4a8283b6e9de229167775bbceefd7f1e756985c: Status 404 returned error can't find the container with id a8496bff2c75f0685c9129b7e4a8283b6e9de229167775bbceefd7f1e756985c Feb 18 14:11:10 crc kubenswrapper[4817]: I0218 14:11:10.762869 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d74b67b44-lgldb"] Feb 18 14:11:11 crc kubenswrapper[4817]: I0218 14:11:11.473392 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d74b67b44-lgldb" event={"ID":"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23","Type":"ContainerStarted","Data":"e9da84d34ab0fff0f527399623160db0570a596a547b658ae00e74c313314e14"} Feb 
18 14:11:11 crc kubenswrapper[4817]: I0218 14:11:11.473775 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d74b67b44-lgldb" event={"ID":"ca3c8c4e-7690-4eb5-8d8d-3bf7f7408b23","Type":"ContainerStarted","Data":"872759f350b30c8de852fa3ebeeccdb7d5fcb85cad7587567310b5a0c5c7c4cd"} Feb 18 14:11:11 crc kubenswrapper[4817]: I0218 14:11:11.477564 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq" event={"ID":"1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a","Type":"ContainerStarted","Data":"a8496bff2c75f0685c9129b7e4a8283b6e9de229167775bbceefd7f1e756985c"} Feb 18 14:11:11 crc kubenswrapper[4817]: I0218 14:11:11.490575 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d74b67b44-lgldb" podStartSLOduration=2.490552859 podStartE2EDuration="2.490552859s" podCreationTimestamp="2026-02-18 14:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:11:11.490356374 +0000 UTC m=+734.065892367" watchObservedRunningTime="2026-02-18 14:11:11.490552859 +0000 UTC m=+734.066088852" Feb 18 14:11:13 crc kubenswrapper[4817]: I0218 14:11:13.508712 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-l4ntc" event={"ID":"8b4f277d-2b45-43de-b3f7-52e968407f19","Type":"ContainerStarted","Data":"7e0a118065c7610a722c04ac3bc7c7d03e4805a7a52f35f99df82d10433c7135"} Feb 18 14:11:13 crc kubenswrapper[4817]: I0218 14:11:13.509194 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-l4ntc" Feb 18 14:11:13 crc kubenswrapper[4817]: I0218 14:11:13.510157 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw" 
event={"ID":"abf003b3-f87b-4907-ad15-59b8f12108b3","Type":"ContainerStarted","Data":"64cc99d1ad391dfc483e1a2f6a6a51c47ab29eb1b90d20f7da94370c3b481396"} Feb 18 14:11:13 crc kubenswrapper[4817]: I0218 14:11:13.511934 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq" event={"ID":"1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a","Type":"ContainerStarted","Data":"1d7e02ca38cac1871ae7548381124a34f9530ba46c328936bd6e337dc7d75cb2"} Feb 18 14:11:13 crc kubenswrapper[4817]: I0218 14:11:13.512169 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq" Feb 18 14:11:13 crc kubenswrapper[4817]: I0218 14:11:13.514427 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qnftd" event={"ID":"65c92dc4-036d-42e0-baa1-3dc7e23c43b3","Type":"ContainerStarted","Data":"bf2f72d2a1f6a5eae0309e0997f46c2dc890b4cb6a13786d726989c91a4b6759"} Feb 18 14:11:13 crc kubenswrapper[4817]: I0218 14:11:13.524803 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-l4ntc" podStartSLOduration=1.594429109 podStartE2EDuration="4.524786637s" podCreationTimestamp="2026-02-18 14:11:09 +0000 UTC" firstStartedPulling="2026-02-18 14:11:10.029767449 +0000 UTC m=+732.605303432" lastFinishedPulling="2026-02-18 14:11:12.960124987 +0000 UTC m=+735.535660960" observedRunningTime="2026-02-18 14:11:13.524249193 +0000 UTC m=+736.099785186" watchObservedRunningTime="2026-02-18 14:11:13.524786637 +0000 UTC m=+736.100322610" Feb 18 14:11:13 crc kubenswrapper[4817]: I0218 14:11:13.543775 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq" podStartSLOduration=2.188099817 podStartE2EDuration="4.543759166s" podCreationTimestamp="2026-02-18 14:11:09 +0000 UTC" firstStartedPulling="2026-02-18 14:11:10.631900176 +0000 UTC 
m=+733.207436179" lastFinishedPulling="2026-02-18 14:11:12.987559545 +0000 UTC m=+735.563095528" observedRunningTime="2026-02-18 14:11:13.541123661 +0000 UTC m=+736.116659644" watchObservedRunningTime="2026-02-18 14:11:13.543759166 +0000 UTC m=+736.119295159" Feb 18 14:11:15 crc kubenswrapper[4817]: I0218 14:11:15.531885 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qnftd" event={"ID":"65c92dc4-036d-42e0-baa1-3dc7e23c43b3","Type":"ContainerStarted","Data":"867aecf679ef7ea119357971b18c04114b872203436f985ddca79acc5869273e"} Feb 18 14:11:15 crc kubenswrapper[4817]: I0218 14:11:15.559051 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-8qsxw" podStartSLOduration=4.042204238 podStartE2EDuration="6.559026695s" podCreationTimestamp="2026-02-18 14:11:09 +0000 UTC" firstStartedPulling="2026-02-18 14:11:10.438658935 +0000 UTC m=+733.014194918" lastFinishedPulling="2026-02-18 14:11:12.955481392 +0000 UTC m=+735.531017375" observedRunningTime="2026-02-18 14:11:13.556077991 +0000 UTC m=+736.131613984" watchObservedRunningTime="2026-02-18 14:11:15.559026695 +0000 UTC m=+738.134562678" Feb 18 14:11:15 crc kubenswrapper[4817]: I0218 14:11:15.567253 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qnftd" podStartSLOduration=1.83336311 podStartE2EDuration="6.567232028s" podCreationTimestamp="2026-02-18 14:11:09 +0000 UTC" firstStartedPulling="2026-02-18 14:11:10.366865909 +0000 UTC m=+732.942401882" lastFinishedPulling="2026-02-18 14:11:15.100734787 +0000 UTC m=+737.676270800" observedRunningTime="2026-02-18 14:11:15.552918554 +0000 UTC m=+738.128454537" watchObservedRunningTime="2026-02-18 14:11:15.567232028 +0000 UTC m=+738.142768011" Feb 18 14:11:19 crc kubenswrapper[4817]: I0218 14:11:19.868117 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-nmstate/nmstate-handler-l4ntc" Feb 18 14:11:20 crc kubenswrapper[4817]: I0218 14:11:20.307768 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-d74b67b44-lgldb" Feb 18 14:11:20 crc kubenswrapper[4817]: I0218 14:11:20.307856 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d74b67b44-lgldb" Feb 18 14:11:20 crc kubenswrapper[4817]: I0218 14:11:20.315523 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d74b67b44-lgldb" Feb 18 14:11:20 crc kubenswrapper[4817]: I0218 14:11:20.579304 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d74b67b44-lgldb" Feb 18 14:11:20 crc kubenswrapper[4817]: I0218 14:11:20.642693 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v2snv"] Feb 18 14:11:30 crc kubenswrapper[4817]: I0218 14:11:30.405670 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-b2vsq" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.132382 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z57r5"] Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.136669 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.142162 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z57r5"] Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.248323 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-catalog-content\") pod \"community-operators-z57r5\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.248386 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-utilities\") pod \"community-operators-z57r5\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.248406 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv2jf\" (UniqueName: \"kubernetes.io/projected/60c1a950-68ea-4e2c-8c85-33075644473f-kube-api-access-jv2jf\") pod \"community-operators-z57r5\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.349414 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-catalog-content\") pod \"community-operators-z57r5\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.349506 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-utilities\") pod \"community-operators-z57r5\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.349538 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv2jf\" (UniqueName: \"kubernetes.io/projected/60c1a950-68ea-4e2c-8c85-33075644473f-kube-api-access-jv2jf\") pod \"community-operators-z57r5\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.350738 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-catalog-content\") pod \"community-operators-z57r5\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.351121 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-utilities\") pod \"community-operators-z57r5\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.380576 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv2jf\" (UniqueName: \"kubernetes.io/projected/60c1a950-68ea-4e2c-8c85-33075644473f-kube-api-access-jv2jf\") pod \"community-operators-z57r5\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:40 crc kubenswrapper[4817]: I0218 14:11:40.514884 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:41 crc kubenswrapper[4817]: I0218 14:11:41.071871 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z57r5"] Feb 18 14:11:41 crc kubenswrapper[4817]: I0218 14:11:41.717235 4817 generic.go:334] "Generic (PLEG): container finished" podID="60c1a950-68ea-4e2c-8c85-33075644473f" containerID="60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c" exitCode=0 Feb 18 14:11:41 crc kubenswrapper[4817]: I0218 14:11:41.717313 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z57r5" event={"ID":"60c1a950-68ea-4e2c-8c85-33075644473f","Type":"ContainerDied","Data":"60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c"} Feb 18 14:11:41 crc kubenswrapper[4817]: I0218 14:11:41.718048 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z57r5" event={"ID":"60c1a950-68ea-4e2c-8c85-33075644473f","Type":"ContainerStarted","Data":"47a18cd7fb7699432844f780cf0e6e234414bc2cda2b4d484ea60c954546e7c3"} Feb 18 14:11:42 crc kubenswrapper[4817]: I0218 14:11:42.733252 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z57r5" event={"ID":"60c1a950-68ea-4e2c-8c85-33075644473f","Type":"ContainerStarted","Data":"495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7"} Feb 18 14:11:43 crc kubenswrapper[4817]: I0218 14:11:43.741317 4817 generic.go:334] "Generic (PLEG): container finished" podID="60c1a950-68ea-4e2c-8c85-33075644473f" containerID="495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7" exitCode=0 Feb 18 14:11:43 crc kubenswrapper[4817]: I0218 14:11:43.741425 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z57r5" 
event={"ID":"60c1a950-68ea-4e2c-8c85-33075644473f","Type":"ContainerDied","Data":"495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7"} Feb 18 14:11:44 crc kubenswrapper[4817]: I0218 14:11:44.756599 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z57r5" event={"ID":"60c1a950-68ea-4e2c-8c85-33075644473f","Type":"ContainerStarted","Data":"bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7"} Feb 18 14:11:44 crc kubenswrapper[4817]: I0218 14:11:44.776439 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z57r5" podStartSLOduration=2.384729832 podStartE2EDuration="4.776420994s" podCreationTimestamp="2026-02-18 14:11:40 +0000 UTC" firstStartedPulling="2026-02-18 14:11:41.722811107 +0000 UTC m=+764.298347090" lastFinishedPulling="2026-02-18 14:11:44.114502269 +0000 UTC m=+766.690038252" observedRunningTime="2026-02-18 14:11:44.774376404 +0000 UTC m=+767.349912417" watchObservedRunningTime="2026-02-18 14:11:44.776420994 +0000 UTC m=+767.351956977" Feb 18 14:11:45 crc kubenswrapper[4817]: I0218 14:11:45.690431 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-v2snv" podUID="149bcfc3-9623-403e-8c4c-1019bd5f0c16" containerName="console" containerID="cri-o://1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2" gracePeriod=15 Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.061540 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v2snv_149bcfc3-9623-403e-8c4c-1019bd5f0c16/console/0.log" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.061656 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-v2snv" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.131244 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-serving-cert\") pod \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.131308 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-trusted-ca-bundle\") pod \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.131363 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbh4k\" (UniqueName: \"kubernetes.io/projected/149bcfc3-9623-403e-8c4c-1019bd5f0c16-kube-api-access-qbh4k\") pod \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.132209 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "149bcfc3-9623-403e-8c4c-1019bd5f0c16" (UID: "149bcfc3-9623-403e-8c4c-1019bd5f0c16"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.131387 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-oauth-config\") pod \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.132334 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-config\") pod \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.132629 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-config" (OuterVolumeSpecName: "console-config") pod "149bcfc3-9623-403e-8c4c-1019bd5f0c16" (UID: "149bcfc3-9623-403e-8c4c-1019bd5f0c16"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.132658 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-service-ca\") pod \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.132697 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-oauth-serving-cert\") pod \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\" (UID: \"149bcfc3-9623-403e-8c4c-1019bd5f0c16\") " Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.132907 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-service-ca" (OuterVolumeSpecName: "service-ca") pod "149bcfc3-9623-403e-8c4c-1019bd5f0c16" (UID: "149bcfc3-9623-403e-8c4c-1019bd5f0c16"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.133037 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "149bcfc3-9623-403e-8c4c-1019bd5f0c16" (UID: "149bcfc3-9623-403e-8c4c-1019bd5f0c16"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.133364 4817 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.133379 4817 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.133387 4817 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-service-ca\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.133397 4817 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/149bcfc3-9623-403e-8c4c-1019bd5f0c16-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.137365 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "149bcfc3-9623-403e-8c4c-1019bd5f0c16" (UID: "149bcfc3-9623-403e-8c4c-1019bd5f0c16"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.156195 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "149bcfc3-9623-403e-8c4c-1019bd5f0c16" (UID: "149bcfc3-9623-403e-8c4c-1019bd5f0c16"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.156315 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149bcfc3-9623-403e-8c4c-1019bd5f0c16-kube-api-access-qbh4k" (OuterVolumeSpecName: "kube-api-access-qbh4k") pod "149bcfc3-9623-403e-8c4c-1019bd5f0c16" (UID: "149bcfc3-9623-403e-8c4c-1019bd5f0c16"). InnerVolumeSpecName "kube-api-access-qbh4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.235199 4817 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.235241 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbh4k\" (UniqueName: \"kubernetes.io/projected/149bcfc3-9623-403e-8c4c-1019bd5f0c16-kube-api-access-qbh4k\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.235275 4817 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/149bcfc3-9623-403e-8c4c-1019bd5f0c16-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.770037 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-v2snv_149bcfc3-9623-403e-8c4c-1019bd5f0c16/console/0.log" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.770082 4817 generic.go:334] "Generic (PLEG): container finished" podID="149bcfc3-9623-403e-8c4c-1019bd5f0c16" containerID="1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2" exitCode=2 Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.770110 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v2snv" 
event={"ID":"149bcfc3-9623-403e-8c4c-1019bd5f0c16","Type":"ContainerDied","Data":"1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2"} Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.770138 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-v2snv" event={"ID":"149bcfc3-9623-403e-8c4c-1019bd5f0c16","Type":"ContainerDied","Data":"6ac37bfaa3dea22921cdf36cd458fa7757e771b2e5b3b88783b1d68962ee8573"} Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.770159 4817 scope.go:117] "RemoveContainer" containerID="1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.770278 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-v2snv" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.790062 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-v2snv"] Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.810867 4817 scope.go:117] "RemoveContainer" containerID="1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2" Feb 18 14:11:46 crc kubenswrapper[4817]: E0218 14:11:46.811828 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2\": container with ID starting with 1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2 not found: ID does not exist" containerID="1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.812143 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2"} err="failed to get container status \"1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2\": rpc error: code 
= NotFound desc = could not find container \"1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2\": container with ID starting with 1c4b06f7ab77447089188bafbdd63f0f794c321dcb0d86ec86ac187279f2f3b2 not found: ID does not exist" Feb 18 14:11:46 crc kubenswrapper[4817]: I0218 14:11:46.823005 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-v2snv"] Feb 18 14:11:48 crc kubenswrapper[4817]: I0218 14:11:48.179204 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149bcfc3-9623-403e-8c4c-1019bd5f0c16" path="/var/lib/kubelet/pods/149bcfc3-9623-403e-8c4c-1019bd5f0c16/volumes" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.515645 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.517744 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.563417 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg"] Feb 18 14:11:50 crc kubenswrapper[4817]: E0218 14:11:50.563759 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149bcfc3-9623-403e-8c4c-1019bd5f0c16" containerName="console" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.563788 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="149bcfc3-9623-403e-8c4c-1019bd5f0c16" containerName="console" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.564012 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="149bcfc3-9623-403e-8c4c-1019bd5f0c16" containerName="console" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.565397 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.572461 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.584755 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.585801 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg"] Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.599962 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.600013 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzh77\" (UniqueName: \"kubernetes.io/projected/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-kube-api-access-rzh77\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.600046 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-util\") pod 
\"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.701210 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.701284 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzh77\" (UniqueName: \"kubernetes.io/projected/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-kube-api-access-rzh77\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.701332 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.701885 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.701928 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.719932 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzh77\" (UniqueName: \"kubernetes.io/projected/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-kube-api-access-rzh77\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.839092 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:50 crc kubenswrapper[4817]: I0218 14:11:50.886024 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:51 crc kubenswrapper[4817]: I0218 14:11:51.293834 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg"] Feb 18 14:11:51 crc kubenswrapper[4817]: I0218 14:11:51.803101 4817 generic.go:334] "Generic (PLEG): container finished" podID="d8a0b33c-2815-43e2-bdcc-6a1b99682d34" containerID="25e11dfaa9da9ec110704a026d011748f83202e8b92d46ec93b797ff8421fe59" exitCode=0 Feb 18 14:11:51 crc kubenswrapper[4817]: I0218 14:11:51.803186 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" event={"ID":"d8a0b33c-2815-43e2-bdcc-6a1b99682d34","Type":"ContainerDied","Data":"25e11dfaa9da9ec110704a026d011748f83202e8b92d46ec93b797ff8421fe59"} Feb 18 14:11:51 crc kubenswrapper[4817]: I0218 14:11:51.803303 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" event={"ID":"d8a0b33c-2815-43e2-bdcc-6a1b99682d34","Type":"ContainerStarted","Data":"00e916c329773d2359c7dedc7455eaac494c2243df3e93b49a1fe38fcc779239"} Feb 18 14:11:53 crc kubenswrapper[4817]: I0218 14:11:53.819194 4817 generic.go:334] "Generic (PLEG): container finished" podID="d8a0b33c-2815-43e2-bdcc-6a1b99682d34" containerID="6a11baf5f748dca4399c91f08dcc21037d1a724a63e21f4c400e234cabaf3619" exitCode=0 Feb 18 14:11:53 crc kubenswrapper[4817]: I0218 14:11:53.819329 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" event={"ID":"d8a0b33c-2815-43e2-bdcc-6a1b99682d34","Type":"ContainerDied","Data":"6a11baf5f748dca4399c91f08dcc21037d1a724a63e21f4c400e234cabaf3619"} Feb 18 14:11:53 crc kubenswrapper[4817]: I0218 14:11:53.914961 4817 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z57r5"] Feb 18 14:11:53 crc kubenswrapper[4817]: I0218 14:11:53.915195 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z57r5" podUID="60c1a950-68ea-4e2c-8c85-33075644473f" containerName="registry-server" containerID="cri-o://bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7" gracePeriod=2 Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.278435 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.378651 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-catalog-content\") pod \"60c1a950-68ea-4e2c-8c85-33075644473f\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.378731 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv2jf\" (UniqueName: \"kubernetes.io/projected/60c1a950-68ea-4e2c-8c85-33075644473f-kube-api-access-jv2jf\") pod \"60c1a950-68ea-4e2c-8c85-33075644473f\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.378873 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-utilities\") pod \"60c1a950-68ea-4e2c-8c85-33075644473f\" (UID: \"60c1a950-68ea-4e2c-8c85-33075644473f\") " Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.379888 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-utilities" (OuterVolumeSpecName: "utilities") pod 
"60c1a950-68ea-4e2c-8c85-33075644473f" (UID: "60c1a950-68ea-4e2c-8c85-33075644473f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.384621 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60c1a950-68ea-4e2c-8c85-33075644473f-kube-api-access-jv2jf" (OuterVolumeSpecName: "kube-api-access-jv2jf") pod "60c1a950-68ea-4e2c-8c85-33075644473f" (UID: "60c1a950-68ea-4e2c-8c85-33075644473f"). InnerVolumeSpecName "kube-api-access-jv2jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.429670 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60c1a950-68ea-4e2c-8c85-33075644473f" (UID: "60c1a950-68ea-4e2c-8c85-33075644473f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.479933 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.479992 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c1a950-68ea-4e2c-8c85-33075644473f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.480003 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv2jf\" (UniqueName: \"kubernetes.io/projected/60c1a950-68ea-4e2c-8c85-33075644473f-kube-api-access-jv2jf\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.830168 4817 generic.go:334] "Generic (PLEG): container finished" podID="d8a0b33c-2815-43e2-bdcc-6a1b99682d34" containerID="1a6a5949944ce3b41ee88372688307b0bb2ccd2be4605e88e6f21175de7aed67" exitCode=0 Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.830251 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" event={"ID":"d8a0b33c-2815-43e2-bdcc-6a1b99682d34","Type":"ContainerDied","Data":"1a6a5949944ce3b41ee88372688307b0bb2ccd2be4605e88e6f21175de7aed67"} Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.832462 4817 generic.go:334] "Generic (PLEG): container finished" podID="60c1a950-68ea-4e2c-8c85-33075644473f" containerID="bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7" exitCode=0 Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.832515 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z57r5" 
event={"ID":"60c1a950-68ea-4e2c-8c85-33075644473f","Type":"ContainerDied","Data":"bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7"} Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.832538 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z57r5" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.832564 4817 scope.go:117] "RemoveContainer" containerID="bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.832551 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z57r5" event={"ID":"60c1a950-68ea-4e2c-8c85-33075644473f","Type":"ContainerDied","Data":"47a18cd7fb7699432844f780cf0e6e234414bc2cda2b4d484ea60c954546e7c3"} Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.863161 4817 scope.go:117] "RemoveContainer" containerID="495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.896660 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z57r5"] Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.919812 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z57r5"] Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.933175 4817 scope.go:117] "RemoveContainer" containerID="60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.957533 4817 scope.go:117] "RemoveContainer" containerID="bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7" Feb 18 14:11:54 crc kubenswrapper[4817]: E0218 14:11:54.957950 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7\": container 
with ID starting with bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7 not found: ID does not exist" containerID="bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.957990 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7"} err="failed to get container status \"bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7\": rpc error: code = NotFound desc = could not find container \"bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7\": container with ID starting with bba3fafac773a8bb0201f15c4821c7bf6a4fc544777ca83150e3904c232652e7 not found: ID does not exist" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.958010 4817 scope.go:117] "RemoveContainer" containerID="495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7" Feb 18 14:11:54 crc kubenswrapper[4817]: E0218 14:11:54.958358 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7\": container with ID starting with 495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7 not found: ID does not exist" containerID="495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.958378 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7"} err="failed to get container status \"495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7\": rpc error: code = NotFound desc = could not find container \"495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7\": container with ID starting with 495468f4594461f8244d80d59ce0af40c47dd3b6be9d7811e1ef56bd5b7039a7 not 
found: ID does not exist" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.958391 4817 scope.go:117] "RemoveContainer" containerID="60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c" Feb 18 14:11:54 crc kubenswrapper[4817]: E0218 14:11:54.958604 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c\": container with ID starting with 60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c not found: ID does not exist" containerID="60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c" Feb 18 14:11:54 crc kubenswrapper[4817]: I0218 14:11:54.958622 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c"} err="failed to get container status \"60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c\": rpc error: code = NotFound desc = could not find container \"60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c\": container with ID starting with 60efd4ce70dd6bff614a3aa8980e34d71567c9999a21466d86a1d766e60a2e6c not found: ID does not exist" Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.064822 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.179016 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60c1a950-68ea-4e2c-8c85-33075644473f" path="/var/lib/kubelet/pods/60c1a950-68ea-4e2c-8c85-33075644473f/volumes" Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.202659 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-util\") pod \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.202787 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzh77\" (UniqueName: \"kubernetes.io/projected/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-kube-api-access-rzh77\") pod \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.202816 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-bundle\") pod \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\" (UID: \"d8a0b33c-2815-43e2-bdcc-6a1b99682d34\") " Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.204331 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-bundle" (OuterVolumeSpecName: "bundle") pod "d8a0b33c-2815-43e2-bdcc-6a1b99682d34" (UID: "d8a0b33c-2815-43e2-bdcc-6a1b99682d34"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.214278 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-kube-api-access-rzh77" (OuterVolumeSpecName: "kube-api-access-rzh77") pod "d8a0b33c-2815-43e2-bdcc-6a1b99682d34" (UID: "d8a0b33c-2815-43e2-bdcc-6a1b99682d34"). InnerVolumeSpecName "kube-api-access-rzh77". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.218847 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-util" (OuterVolumeSpecName: "util") pod "d8a0b33c-2815-43e2-bdcc-6a1b99682d34" (UID: "d8a0b33c-2815-43e2-bdcc-6a1b99682d34"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.304793 4817 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-util\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.304833 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzh77\" (UniqueName: \"kubernetes.io/projected/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-kube-api-access-rzh77\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.304844 4817 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8a0b33c-2815-43e2-bdcc-6a1b99682d34-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.847934 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" 
event={"ID":"d8a0b33c-2815-43e2-bdcc-6a1b99682d34","Type":"ContainerDied","Data":"00e916c329773d2359c7dedc7455eaac494c2243df3e93b49a1fe38fcc779239"} Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.847974 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00e916c329773d2359c7dedc7455eaac494c2243df3e93b49a1fe38fcc779239" Feb 18 14:11:56 crc kubenswrapper[4817]: I0218 14:11:56.848094 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.048997 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw"] Feb 18 14:12:06 crc kubenswrapper[4817]: E0218 14:12:06.049881 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c1a950-68ea-4e2c-8c85-33075644473f" containerName="extract-content" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.049897 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c1a950-68ea-4e2c-8c85-33075644473f" containerName="extract-content" Feb 18 14:12:06 crc kubenswrapper[4817]: E0218 14:12:06.049917 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a0b33c-2815-43e2-bdcc-6a1b99682d34" containerName="pull" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.049926 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a0b33c-2815-43e2-bdcc-6a1b99682d34" containerName="pull" Feb 18 14:12:06 crc kubenswrapper[4817]: E0218 14:12:06.049935 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a0b33c-2815-43e2-bdcc-6a1b99682d34" containerName="extract" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.049989 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a0b33c-2815-43e2-bdcc-6a1b99682d34" containerName="extract" Feb 18 14:12:06 crc kubenswrapper[4817]: E0218 
14:12:06.050004 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c1a950-68ea-4e2c-8c85-33075644473f" containerName="registry-server" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.050012 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c1a950-68ea-4e2c-8c85-33075644473f" containerName="registry-server" Feb 18 14:12:06 crc kubenswrapper[4817]: E0218 14:12:06.050027 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c1a950-68ea-4e2c-8c85-33075644473f" containerName="extract-utilities" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.050034 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c1a950-68ea-4e2c-8c85-33075644473f" containerName="extract-utilities" Feb 18 14:12:06 crc kubenswrapper[4817]: E0218 14:12:06.050043 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a0b33c-2815-43e2-bdcc-6a1b99682d34" containerName="util" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.050052 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a0b33c-2815-43e2-bdcc-6a1b99682d34" containerName="util" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.050178 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="60c1a950-68ea-4e2c-8c85-33075644473f" containerName="registry-server" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.050195 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a0b33c-2815-43e2-bdcc-6a1b99682d34" containerName="extract" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.050699 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.055757 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.055837 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.057461 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-qpkz8" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.057942 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.059366 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.072869 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw"] Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.143600 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba0591c4-822e-406b-a86b-1f2a6078452c-webhook-cert\") pod \"metallb-operator-controller-manager-7c8dd94b68-49zzw\" (UID: \"ba0591c4-822e-406b-a86b-1f2a6078452c\") " pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.143680 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba0591c4-822e-406b-a86b-1f2a6078452c-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8dd94b68-49zzw\" (UID: 
\"ba0591c4-822e-406b-a86b-1f2a6078452c\") " pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.143712 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n54bg\" (UniqueName: \"kubernetes.io/projected/ba0591c4-822e-406b-a86b-1f2a6078452c-kube-api-access-n54bg\") pod \"metallb-operator-controller-manager-7c8dd94b68-49zzw\" (UID: \"ba0591c4-822e-406b-a86b-1f2a6078452c\") " pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.244624 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n54bg\" (UniqueName: \"kubernetes.io/projected/ba0591c4-822e-406b-a86b-1f2a6078452c-kube-api-access-n54bg\") pod \"metallb-operator-controller-manager-7c8dd94b68-49zzw\" (UID: \"ba0591c4-822e-406b-a86b-1f2a6078452c\") " pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.244735 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba0591c4-822e-406b-a86b-1f2a6078452c-webhook-cert\") pod \"metallb-operator-controller-manager-7c8dd94b68-49zzw\" (UID: \"ba0591c4-822e-406b-a86b-1f2a6078452c\") " pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.244772 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba0591c4-822e-406b-a86b-1f2a6078452c-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8dd94b68-49zzw\" (UID: \"ba0591c4-822e-406b-a86b-1f2a6078452c\") " pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.251067 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba0591c4-822e-406b-a86b-1f2a6078452c-webhook-cert\") pod \"metallb-operator-controller-manager-7c8dd94b68-49zzw\" (UID: \"ba0591c4-822e-406b-a86b-1f2a6078452c\") " pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.265991 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba0591c4-822e-406b-a86b-1f2a6078452c-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8dd94b68-49zzw\" (UID: \"ba0591c4-822e-406b-a86b-1f2a6078452c\") " pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.278714 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n54bg\" (UniqueName: \"kubernetes.io/projected/ba0591c4-822e-406b-a86b-1f2a6078452c-kube-api-access-n54bg\") pod \"metallb-operator-controller-manager-7c8dd94b68-49zzw\" (UID: \"ba0591c4-822e-406b-a86b-1f2a6078452c\") " pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.338398 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn"] Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.339205 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.341448 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.341454 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.341685 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-nc46c" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.353730 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn"] Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.365674 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.448604 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blhmh\" (UniqueName: \"kubernetes.io/projected/4c5c3b60-c65f-4f6c-869b-162ebd95eb32-kube-api-access-blhmh\") pod \"metallb-operator-webhook-server-57fdf9bc8-smmwn\" (UID: \"4c5c3b60-c65f-4f6c-869b-162ebd95eb32\") " pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.449052 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5c3b60-c65f-4f6c-869b-162ebd95eb32-apiservice-cert\") pod \"metallb-operator-webhook-server-57fdf9bc8-smmwn\" (UID: \"4c5c3b60-c65f-4f6c-869b-162ebd95eb32\") " pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 
14:12:06.449168 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c5c3b60-c65f-4f6c-869b-162ebd95eb32-webhook-cert\") pod \"metallb-operator-webhook-server-57fdf9bc8-smmwn\" (UID: \"4c5c3b60-c65f-4f6c-869b-162ebd95eb32\") " pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.551611 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blhmh\" (UniqueName: \"kubernetes.io/projected/4c5c3b60-c65f-4f6c-869b-162ebd95eb32-kube-api-access-blhmh\") pod \"metallb-operator-webhook-server-57fdf9bc8-smmwn\" (UID: \"4c5c3b60-c65f-4f6c-869b-162ebd95eb32\") " pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.551663 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5c3b60-c65f-4f6c-869b-162ebd95eb32-apiservice-cert\") pod \"metallb-operator-webhook-server-57fdf9bc8-smmwn\" (UID: \"4c5c3b60-c65f-4f6c-869b-162ebd95eb32\") " pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.551730 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c5c3b60-c65f-4f6c-869b-162ebd95eb32-webhook-cert\") pod \"metallb-operator-webhook-server-57fdf9bc8-smmwn\" (UID: \"4c5c3b60-c65f-4f6c-869b-162ebd95eb32\") " pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.559866 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5c3b60-c65f-4f6c-869b-162ebd95eb32-apiservice-cert\") pod 
\"metallb-operator-webhook-server-57fdf9bc8-smmwn\" (UID: \"4c5c3b60-c65f-4f6c-869b-162ebd95eb32\") " pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.559873 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c5c3b60-c65f-4f6c-869b-162ebd95eb32-webhook-cert\") pod \"metallb-operator-webhook-server-57fdf9bc8-smmwn\" (UID: \"4c5c3b60-c65f-4f6c-869b-162ebd95eb32\") " pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.575465 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blhmh\" (UniqueName: \"kubernetes.io/projected/4c5c3b60-c65f-4f6c-869b-162ebd95eb32-kube-api-access-blhmh\") pod \"metallb-operator-webhook-server-57fdf9bc8-smmwn\" (UID: \"4c5c3b60-c65f-4f6c-869b-162ebd95eb32\") " pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.654289 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.878473 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw"] Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.909998 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn"] Feb 18 14:12:06 crc kubenswrapper[4817]: I0218 14:12:06.912121 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" event={"ID":"ba0591c4-822e-406b-a86b-1f2a6078452c","Type":"ContainerStarted","Data":"97191d42d177c7d22f649628c46db0f050a4a3182b2be9da7f070bb2360ff3b2"} Feb 18 14:12:07 crc kubenswrapper[4817]: I0218 14:12:07.920502 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" event={"ID":"4c5c3b60-c65f-4f6c-869b-162ebd95eb32","Type":"ContainerStarted","Data":"b126ef0120e372818e71b98494f16a2991cbf0b67fed8d1215c76c5253921c48"} Feb 18 14:12:11 crc kubenswrapper[4817]: I0218 14:12:11.972412 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" event={"ID":"ba0591c4-822e-406b-a86b-1f2a6078452c","Type":"ContainerStarted","Data":"e0eb61687ec350687a7ada2ee8660d02b6562918bb48051131ccb5bde238b7f7"} Feb 18 14:12:11 crc kubenswrapper[4817]: I0218 14:12:11.973013 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:11 crc kubenswrapper[4817]: I0218 14:12:11.975676 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" 
event={"ID":"4c5c3b60-c65f-4f6c-869b-162ebd95eb32","Type":"ContainerStarted","Data":"13f9a33e412bfd56e3f1076aa7d70b81c8d452c61dae4b356898310befb0760a"} Feb 18 14:12:11 crc kubenswrapper[4817]: I0218 14:12:11.975827 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:12 crc kubenswrapper[4817]: I0218 14:12:12.030374 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" podStartSLOduration=1.963010259 podStartE2EDuration="6.03035254s" podCreationTimestamp="2026-02-18 14:12:06 +0000 UTC" firstStartedPulling="2026-02-18 14:12:06.897822368 +0000 UTC m=+789.473358351" lastFinishedPulling="2026-02-18 14:12:10.965164649 +0000 UTC m=+793.540700632" observedRunningTime="2026-02-18 14:12:12.010120295 +0000 UTC m=+794.585656288" watchObservedRunningTime="2026-02-18 14:12:12.03035254 +0000 UTC m=+794.605888523" Feb 18 14:12:12 crc kubenswrapper[4817]: I0218 14:12:12.863798 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:12:12 crc kubenswrapper[4817]: I0218 14:12:12.863865 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:12:26 crc kubenswrapper[4817]: I0218 14:12:26.659760 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" Feb 18 14:12:26 crc kubenswrapper[4817]: 
I0218 14:12:26.678471 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-57fdf9bc8-smmwn" podStartSLOduration=16.618488658 podStartE2EDuration="20.678453612s" podCreationTimestamp="2026-02-18 14:12:06 +0000 UTC" firstStartedPulling="2026-02-18 14:12:06.925292916 +0000 UTC m=+789.500828899" lastFinishedPulling="2026-02-18 14:12:10.98525787 +0000 UTC m=+793.560793853" observedRunningTime="2026-02-18 14:12:12.028146013 +0000 UTC m=+794.603682016" watchObservedRunningTime="2026-02-18 14:12:26.678453612 +0000 UTC m=+809.253989595" Feb 18 14:12:42 crc kubenswrapper[4817]: I0218 14:12:42.863604 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:12:42 crc kubenswrapper[4817]: I0218 14:12:42.864090 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:12:46 crc kubenswrapper[4817]: I0218 14:12:46.368223 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7c8dd94b68-49zzw" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.019005 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-dzb5b"] Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.021244 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.023271 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-5gp59" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.023416 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.023495 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.029092 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt"] Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.030231 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.031752 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.046966 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt"] Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.106142 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md7pj\" (UniqueName: \"kubernetes.io/projected/e2fe6fd7-48f6-47ec-b4b3-60016704bad9-kube-api-access-md7pj\") pod \"frr-k8s-webhook-server-78b44bf5bb-kphxt\" (UID: \"e2fe6fd7-48f6-47ec-b4b3-60016704bad9\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.106207 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snlht\" (UniqueName: 
\"kubernetes.io/projected/ea454868-79b0-415d-8c0a-6c176b3ca98b-kube-api-access-snlht\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.106232 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea454868-79b0-415d-8c0a-6c176b3ca98b-metrics-certs\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.106260 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-metrics\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.106291 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-frr-conf\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.106320 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-frr-sockets\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.106344 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-reloader\") pod \"frr-k8s-dzb5b\" (UID: 
\"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.106385 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2fe6fd7-48f6-47ec-b4b3-60016704bad9-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-kphxt\" (UID: \"e2fe6fd7-48f6-47ec-b4b3-60016704bad9\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.106419 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ea454868-79b0-415d-8c0a-6c176b3ca98b-frr-startup\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.116518 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-rzsqh"] Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.117740 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.125023 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.125337 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.125368 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hzrnz" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.125446 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.133823 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-z6fs7"] Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.134991 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.136998 4817 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.148502 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-z6fs7"] Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207660 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md7pj\" (UniqueName: \"kubernetes.io/projected/e2fe6fd7-48f6-47ec-b4b3-60016704bad9-kube-api-access-md7pj\") pod \"frr-k8s-webhook-server-78b44bf5bb-kphxt\" (UID: \"e2fe6fd7-48f6-47ec-b4b3-60016704bad9\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207717 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snlht\" (UniqueName: \"kubernetes.io/projected/ea454868-79b0-415d-8c0a-6c176b3ca98b-kube-api-access-snlht\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207746 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea454868-79b0-415d-8c0a-6c176b3ca98b-metrics-certs\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207780 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-metrics\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207804 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5889628d-b78a-4279-95fd-ec441aac9d34-cert\") pod \"controller-69bbfbf88f-z6fs7\" (UID: \"5889628d-b78a-4279-95fd-ec441aac9d34\") " pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207828 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-frr-conf\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207856 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvvrw\" (UniqueName: \"kubernetes.io/projected/5889628d-b78a-4279-95fd-ec441aac9d34-kube-api-access-cvvrw\") pod \"controller-69bbfbf88f-z6fs7\" (UID: \"5889628d-b78a-4279-95fd-ec441aac9d34\") " pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207881 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-frr-sockets\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207904 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-memberlist\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: E0218 14:12:47.207907 4817 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" 
not found Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207930 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-reloader\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.207999 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmx6m\" (UniqueName: \"kubernetes.io/projected/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-kube-api-access-wmx6m\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.208035 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-metrics-certs\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.208063 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5889628d-b78a-4279-95fd-ec441aac9d34-metrics-certs\") pod \"controller-69bbfbf88f-z6fs7\" (UID: \"5889628d-b78a-4279-95fd-ec441aac9d34\") " pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.208107 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2fe6fd7-48f6-47ec-b4b3-60016704bad9-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-kphxt\" (UID: \"e2fe6fd7-48f6-47ec-b4b3-60016704bad9\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.208134 
4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-metallb-excludel2\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.208168 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ea454868-79b0-415d-8c0a-6c176b3ca98b-frr-startup\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: E0218 14:12:47.208270 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea454868-79b0-415d-8c0a-6c176b3ca98b-metrics-certs podName:ea454868-79b0-415d-8c0a-6c176b3ca98b nodeName:}" failed. No retries permitted until 2026-02-18 14:12:47.708250453 +0000 UTC m=+830.283786436 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea454868-79b0-415d-8c0a-6c176b3ca98b-metrics-certs") pod "frr-k8s-dzb5b" (UID: "ea454868-79b0-415d-8c0a-6c176b3ca98b") : secret "frr-k8s-certs-secret" not found Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.208289 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-metrics\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.208470 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-frr-conf\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.208587 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-frr-sockets\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.208664 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ea454868-79b0-415d-8c0a-6c176b3ca98b-reloader\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.209450 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ea454868-79b0-415d-8c0a-6c176b3ca98b-frr-startup\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 
14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.215827 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2fe6fd7-48f6-47ec-b4b3-60016704bad9-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-kphxt\" (UID: \"e2fe6fd7-48f6-47ec-b4b3-60016704bad9\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.226842 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snlht\" (UniqueName: \"kubernetes.io/projected/ea454868-79b0-415d-8c0a-6c176b3ca98b-kube-api-access-snlht\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.236925 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md7pj\" (UniqueName: \"kubernetes.io/projected/e2fe6fd7-48f6-47ec-b4b3-60016704bad9-kube-api-access-md7pj\") pod \"frr-k8s-webhook-server-78b44bf5bb-kphxt\" (UID: \"e2fe6fd7-48f6-47ec-b4b3-60016704bad9\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.309124 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5889628d-b78a-4279-95fd-ec441aac9d34-cert\") pod \"controller-69bbfbf88f-z6fs7\" (UID: \"5889628d-b78a-4279-95fd-ec441aac9d34\") " pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.309466 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvvrw\" (UniqueName: \"kubernetes.io/projected/5889628d-b78a-4279-95fd-ec441aac9d34-kube-api-access-cvvrw\") pod \"controller-69bbfbf88f-z6fs7\" (UID: \"5889628d-b78a-4279-95fd-ec441aac9d34\") " pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 
14:12:47.309489 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-memberlist\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.309517 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmx6m\" (UniqueName: \"kubernetes.io/projected/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-kube-api-access-wmx6m\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.309544 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-metrics-certs\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.309559 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5889628d-b78a-4279-95fd-ec441aac9d34-metrics-certs\") pod \"controller-69bbfbf88f-z6fs7\" (UID: \"5889628d-b78a-4279-95fd-ec441aac9d34\") " pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.309575 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-metallb-excludel2\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: E0218 14:12:47.309778 4817 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 18 14:12:47 crc 
kubenswrapper[4817]: E0218 14:12:47.309868 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-memberlist podName:2ce92458-8bf0-41c0-95d1-219f6c35cdf5 nodeName:}" failed. No retries permitted until 2026-02-18 14:12:47.809843457 +0000 UTC m=+830.385379520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-memberlist") pod "speaker-rzsqh" (UID: "2ce92458-8bf0-41c0-95d1-219f6c35cdf5") : secret "metallb-memberlist" not found Feb 18 14:12:47 crc kubenswrapper[4817]: E0218 14:12:47.309944 4817 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 18 14:12:47 crc kubenswrapper[4817]: E0218 14:12:47.310013 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-metrics-certs podName:2ce92458-8bf0-41c0-95d1-219f6c35cdf5 nodeName:}" failed. No retries permitted until 2026-02-18 14:12:47.809995301 +0000 UTC m=+830.385531364 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-metrics-certs") pod "speaker-rzsqh" (UID: "2ce92458-8bf0-41c0-95d1-219f6c35cdf5") : secret "speaker-certs-secret" not found Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.310868 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-metallb-excludel2\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.313838 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5889628d-b78a-4279-95fd-ec441aac9d34-cert\") pod \"controller-69bbfbf88f-z6fs7\" (UID: \"5889628d-b78a-4279-95fd-ec441aac9d34\") " pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.315609 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5889628d-b78a-4279-95fd-ec441aac9d34-metrics-certs\") pod \"controller-69bbfbf88f-z6fs7\" (UID: \"5889628d-b78a-4279-95fd-ec441aac9d34\") " pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.330627 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmx6m\" (UniqueName: \"kubernetes.io/projected/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-kube-api-access-wmx6m\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.335875 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvvrw\" (UniqueName: \"kubernetes.io/projected/5889628d-b78a-4279-95fd-ec441aac9d34-kube-api-access-cvvrw\") pod 
\"controller-69bbfbf88f-z6fs7\" (UID: \"5889628d-b78a-4279-95fd-ec441aac9d34\") " pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.348136 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.459361 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.683696 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-z6fs7"] Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.715061 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea454868-79b0-415d-8c0a-6c176b3ca98b-metrics-certs\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.722082 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea454868-79b0-415d-8c0a-6c176b3ca98b-metrics-certs\") pod \"frr-k8s-dzb5b\" (UID: \"ea454868-79b0-415d-8c0a-6c176b3ca98b\") " pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.768173 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt"] Feb 18 14:12:47 crc kubenswrapper[4817]: W0218 14:12:47.772422 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2fe6fd7_48f6_47ec_b4b3_60016704bad9.slice/crio-1d0b4110a3f9b060e6ec91c8b4182d59e7593da2a6ce97955a7ce559d267d616 WatchSource:0}: Error finding container 1d0b4110a3f9b060e6ec91c8b4182d59e7593da2a6ce97955a7ce559d267d616: 
Status 404 returned error can't find the container with id 1d0b4110a3f9b060e6ec91c8b4182d59e7593da2a6ce97955a7ce559d267d616 Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.816761 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-memberlist\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.816831 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-metrics-certs\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: E0218 14:12:47.816967 4817 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 18 14:12:47 crc kubenswrapper[4817]: E0218 14:12:47.817067 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-memberlist podName:2ce92458-8bf0-41c0-95d1-219f6c35cdf5 nodeName:}" failed. No retries permitted until 2026-02-18 14:12:48.817045396 +0000 UTC m=+831.392581429 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-memberlist") pod "speaker-rzsqh" (UID: "2ce92458-8bf0-41c0-95d1-219f6c35cdf5") : secret "metallb-memberlist" not found Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.820838 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-metrics-certs\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:47 crc kubenswrapper[4817]: I0218 14:12:47.940301 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:12:48 crc kubenswrapper[4817]: I0218 14:12:48.197255 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dzb5b" event={"ID":"ea454868-79b0-415d-8c0a-6c176b3ca98b","Type":"ContainerStarted","Data":"403892e0947bb458b3eb846786388c43e0c3818642b239d7022ded50419e5f04"} Feb 18 14:12:48 crc kubenswrapper[4817]: I0218 14:12:48.198882 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" event={"ID":"e2fe6fd7-48f6-47ec-b4b3-60016704bad9","Type":"ContainerStarted","Data":"1d0b4110a3f9b060e6ec91c8b4182d59e7593da2a6ce97955a7ce559d267d616"} Feb 18 14:12:48 crc kubenswrapper[4817]: I0218 14:12:48.201148 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-z6fs7" event={"ID":"5889628d-b78a-4279-95fd-ec441aac9d34","Type":"ContainerStarted","Data":"71c0038d0951a95166f981784555b520da8cb4932333ea71b4106b2ca8f72c06"} Feb 18 14:12:48 crc kubenswrapper[4817]: I0218 14:12:48.201196 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-z6fs7" 
event={"ID":"5889628d-b78a-4279-95fd-ec441aac9d34","Type":"ContainerStarted","Data":"462c6e4a6cd32b7c13521379a89f845d7b51c5cbe9df94dc313bcc2c5253b740"} Feb 18 14:12:48 crc kubenswrapper[4817]: I0218 14:12:48.201213 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-z6fs7" event={"ID":"5889628d-b78a-4279-95fd-ec441aac9d34","Type":"ContainerStarted","Data":"f5f891cc677aef68a19763a88ba3d4460d1cf378d0b7a6c8044f832bc7475663"} Feb 18 14:12:48 crc kubenswrapper[4817]: I0218 14:12:48.201322 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:48 crc kubenswrapper[4817]: I0218 14:12:48.228133 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-z6fs7" podStartSLOduration=1.228114831 podStartE2EDuration="1.228114831s" podCreationTimestamp="2026-02-18 14:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:12:48.226628803 +0000 UTC m=+830.802164786" watchObservedRunningTime="2026-02-18 14:12:48.228114831 +0000 UTC m=+830.803650814" Feb 18 14:12:48 crc kubenswrapper[4817]: I0218 14:12:48.830674 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-memberlist\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:48 crc kubenswrapper[4817]: I0218 14:12:48.834419 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2ce92458-8bf0-41c0-95d1-219f6c35cdf5-memberlist\") pod \"speaker-rzsqh\" (UID: \"2ce92458-8bf0-41c0-95d1-219f6c35cdf5\") " pod="metallb-system/speaker-rzsqh" Feb 18 14:12:48 crc kubenswrapper[4817]: I0218 14:12:48.936678 4817 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rzsqh" Feb 18 14:12:49 crc kubenswrapper[4817]: I0218 14:12:49.208719 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rzsqh" event={"ID":"2ce92458-8bf0-41c0-95d1-219f6c35cdf5","Type":"ContainerStarted","Data":"d896e8f69347e6322a34cd0d76e62d05dec4e33dbba2f7edf3ffc02c17aa8020"} Feb 18 14:12:49 crc kubenswrapper[4817]: I0218 14:12:49.209327 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rzsqh" event={"ID":"2ce92458-8bf0-41c0-95d1-219f6c35cdf5","Type":"ContainerStarted","Data":"7ca102da52b8d22fd321056306c3ba882f856ca372c2e2e67ed2e1787483b8aa"} Feb 18 14:12:50 crc kubenswrapper[4817]: I0218 14:12:50.218372 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rzsqh" event={"ID":"2ce92458-8bf0-41c0-95d1-219f6c35cdf5","Type":"ContainerStarted","Data":"736cfe52e2368ac9a0f91fe13949f7469caed6028d529b0bb9dcabe36758fc9a"} Feb 18 14:12:50 crc kubenswrapper[4817]: I0218 14:12:50.219205 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-rzsqh" Feb 18 14:12:50 crc kubenswrapper[4817]: I0218 14:12:50.243019 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-rzsqh" podStartSLOduration=3.243005313 podStartE2EDuration="3.243005313s" podCreationTimestamp="2026-02-18 14:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:12:50.238526449 +0000 UTC m=+832.814062432" watchObservedRunningTime="2026-02-18 14:12:50.243005313 +0000 UTC m=+832.818541296" Feb 18 14:12:56 crc kubenswrapper[4817]: I0218 14:12:56.282085 4817 generic.go:334] "Generic (PLEG): container finished" podID="ea454868-79b0-415d-8c0a-6c176b3ca98b" containerID="8d8c4bfa37d5f37faa84b46f68e64a658bd27539cd7c25dae45476c2e3ec0a77" 
exitCode=0 Feb 18 14:12:56 crc kubenswrapper[4817]: I0218 14:12:56.282180 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dzb5b" event={"ID":"ea454868-79b0-415d-8c0a-6c176b3ca98b","Type":"ContainerDied","Data":"8d8c4bfa37d5f37faa84b46f68e64a658bd27539cd7c25dae45476c2e3ec0a77"} Feb 18 14:12:56 crc kubenswrapper[4817]: I0218 14:12:56.284162 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" event={"ID":"e2fe6fd7-48f6-47ec-b4b3-60016704bad9","Type":"ContainerStarted","Data":"2971456eb21dcf885251f5368e137b036075d60a578d7b1989f4a648f9266200"} Feb 18 14:12:56 crc kubenswrapper[4817]: I0218 14:12:56.284339 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" Feb 18 14:12:56 crc kubenswrapper[4817]: I0218 14:12:56.317816 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" podStartSLOduration=1.815856878 podStartE2EDuration="9.317798549s" podCreationTimestamp="2026-02-18 14:12:47 +0000 UTC" firstStartedPulling="2026-02-18 14:12:47.77474623 +0000 UTC m=+830.350282213" lastFinishedPulling="2026-02-18 14:12:55.276687901 +0000 UTC m=+837.852223884" observedRunningTime="2026-02-18 14:12:56.317223084 +0000 UTC m=+838.892759067" watchObservedRunningTime="2026-02-18 14:12:56.317798549 +0000 UTC m=+838.893334532" Feb 18 14:12:57 crc kubenswrapper[4817]: I0218 14:12:57.290509 4817 generic.go:334] "Generic (PLEG): container finished" podID="ea454868-79b0-415d-8c0a-6c176b3ca98b" containerID="b891b7f4d681321da3c7398e8a9cc7a9d0d1006f41ad2439f8279f7093f3273c" exitCode=0 Feb 18 14:12:57 crc kubenswrapper[4817]: I0218 14:12:57.291390 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dzb5b" 
event={"ID":"ea454868-79b0-415d-8c0a-6c176b3ca98b","Type":"ContainerDied","Data":"b891b7f4d681321da3c7398e8a9cc7a9d0d1006f41ad2439f8279f7093f3273c"} Feb 18 14:12:57 crc kubenswrapper[4817]: I0218 14:12:57.463719 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-z6fs7" Feb 18 14:12:58 crc kubenswrapper[4817]: I0218 14:12:58.298443 4817 generic.go:334] "Generic (PLEG): container finished" podID="ea454868-79b0-415d-8c0a-6c176b3ca98b" containerID="080cf68c7a202a1ffa7a347be9fee9c3d4fd2b6763ab4ea048d5f184a9572a14" exitCode=0 Feb 18 14:12:58 crc kubenswrapper[4817]: I0218 14:12:58.298498 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dzb5b" event={"ID":"ea454868-79b0-415d-8c0a-6c176b3ca98b","Type":"ContainerDied","Data":"080cf68c7a202a1ffa7a347be9fee9c3d4fd2b6763ab4ea048d5f184a9572a14"} Feb 18 14:12:59 crc kubenswrapper[4817]: I0218 14:12:59.311756 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dzb5b" event={"ID":"ea454868-79b0-415d-8c0a-6c176b3ca98b","Type":"ContainerStarted","Data":"16ce1e691efc6e279fd26aa37c1c5f96b3139e0f4a2a53ae8392f747a2c9b572"} Feb 18 14:12:59 crc kubenswrapper[4817]: I0218 14:12:59.312171 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dzb5b" event={"ID":"ea454868-79b0-415d-8c0a-6c176b3ca98b","Type":"ContainerStarted","Data":"328fd08f7dc10a1c6b28326b6de3a9f9a756446be3e0f6dc018e17834f6b383e"} Feb 18 14:12:59 crc kubenswrapper[4817]: I0218 14:12:59.312185 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dzb5b" event={"ID":"ea454868-79b0-415d-8c0a-6c176b3ca98b","Type":"ContainerStarted","Data":"630725cbf189addfc328d6f680220b6800486546038a2b800a1438eab29bf52f"} Feb 18 14:12:59 crc kubenswrapper[4817]: I0218 14:12:59.312199 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dzb5b" 
event={"ID":"ea454868-79b0-415d-8c0a-6c176b3ca98b","Type":"ContainerStarted","Data":"c1a9f42ce38c1c2ee498a873cc3ef261452e4414f9441c6a55dbef677dba833a"} Feb 18 14:12:59 crc kubenswrapper[4817]: I0218 14:12:59.312213 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dzb5b" event={"ID":"ea454868-79b0-415d-8c0a-6c176b3ca98b","Type":"ContainerStarted","Data":"8c39cbc3c6f8987a411977f9e4638c56098a8d52fae345e4649a0cf34e4ba4f4"} Feb 18 14:13:00 crc kubenswrapper[4817]: I0218 14:13:00.322069 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dzb5b" event={"ID":"ea454868-79b0-415d-8c0a-6c176b3ca98b","Type":"ContainerStarted","Data":"0efba055f43c5d8b3faea42c7b2a6ec054ed4c61eadfdc73a5440c4094ba3546"} Feb 18 14:13:00 crc kubenswrapper[4817]: I0218 14:13:00.322464 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:13:00 crc kubenswrapper[4817]: I0218 14:13:00.351909 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-dzb5b" podStartSLOduration=7.090343724 podStartE2EDuration="14.35189078s" podCreationTimestamp="2026-02-18 14:12:46 +0000 UTC" firstStartedPulling="2026-02-18 14:12:48.039495744 +0000 UTC m=+830.615031727" lastFinishedPulling="2026-02-18 14:12:55.3010428 +0000 UTC m=+837.876578783" observedRunningTime="2026-02-18 14:13:00.345882129 +0000 UTC m=+842.921418132" watchObservedRunningTime="2026-02-18 14:13:00.35189078 +0000 UTC m=+842.927426773" Feb 18 14:13:02 crc kubenswrapper[4817]: I0218 14:13:02.941035 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:13:02 crc kubenswrapper[4817]: I0218 14:13:02.976590 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:13:07 crc kubenswrapper[4817]: I0218 14:13:07.356839 4817 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-kphxt" Feb 18 14:13:08 crc kubenswrapper[4817]: I0218 14:13:08.940116 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-rzsqh" Feb 18 14:13:12 crc kubenswrapper[4817]: I0218 14:13:12.863377 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:13:12 crc kubenswrapper[4817]: I0218 14:13:12.864101 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:13:12 crc kubenswrapper[4817]: I0218 14:13:12.864149 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 14:13:12 crc kubenswrapper[4817]: I0218 14:13:12.864885 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45f4df11b9cafd0abed8804744792dcd58abd224e061fc9294ed85d9ec653f5f"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 14:13:12 crc kubenswrapper[4817]: I0218 14:13:12.864949 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" 
containerID="cri-o://45f4df11b9cafd0abed8804744792dcd58abd224e061fc9294ed85d9ec653f5f" gracePeriod=600 Feb 18 14:13:13 crc kubenswrapper[4817]: I0218 14:13:13.401885 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="45f4df11b9cafd0abed8804744792dcd58abd224e061fc9294ed85d9ec653f5f" exitCode=0 Feb 18 14:13:13 crc kubenswrapper[4817]: I0218 14:13:13.401970 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"45f4df11b9cafd0abed8804744792dcd58abd224e061fc9294ed85d9ec653f5f"} Feb 18 14:13:13 crc kubenswrapper[4817]: I0218 14:13:13.402291 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"2a45288dd8059ad4005579ccd7ba9584a44ec34777e8d02ff7b0f8c874cff3f7"} Feb 18 14:13:13 crc kubenswrapper[4817]: I0218 14:13:13.402314 4817 scope.go:117] "RemoveContainer" containerID="7dfc7dd34d408c82e87d251482328355deda32f5409047841a6c0bd478ccafc4" Feb 18 14:13:14 crc kubenswrapper[4817]: I0218 14:13:14.988497 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qvccv"] Feb 18 14:13:14 crc kubenswrapper[4817]: I0218 14:13:14.989672 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qvccv" Feb 18 14:13:14 crc kubenswrapper[4817]: I0218 14:13:14.991684 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 18 14:13:14 crc kubenswrapper[4817]: I0218 14:13:14.992514 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fpx6w" Feb 18 14:13:14 crc kubenswrapper[4817]: I0218 14:13:14.995353 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qvccv"] Feb 18 14:13:15 crc kubenswrapper[4817]: I0218 14:13:15.055698 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 18 14:13:15 crc kubenswrapper[4817]: I0218 14:13:15.102965 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jklpz\" (UniqueName: \"kubernetes.io/projected/c6647561-da17-4635-98dd-b0296e327e7c-kube-api-access-jklpz\") pod \"openstack-operator-index-qvccv\" (UID: \"c6647561-da17-4635-98dd-b0296e327e7c\") " pod="openstack-operators/openstack-operator-index-qvccv" Feb 18 14:13:15 crc kubenswrapper[4817]: I0218 14:13:15.205062 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jklpz\" (UniqueName: \"kubernetes.io/projected/c6647561-da17-4635-98dd-b0296e327e7c-kube-api-access-jklpz\") pod \"openstack-operator-index-qvccv\" (UID: \"c6647561-da17-4635-98dd-b0296e327e7c\") " pod="openstack-operators/openstack-operator-index-qvccv" Feb 18 14:13:15 crc kubenswrapper[4817]: I0218 14:13:15.223014 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jklpz\" (UniqueName: \"kubernetes.io/projected/c6647561-da17-4635-98dd-b0296e327e7c-kube-api-access-jklpz\") pod \"openstack-operator-index-qvccv\" (UID: 
\"c6647561-da17-4635-98dd-b0296e327e7c\") " pod="openstack-operators/openstack-operator-index-qvccv" Feb 18 14:13:15 crc kubenswrapper[4817]: I0218 14:13:15.365111 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qvccv" Feb 18 14:13:15 crc kubenswrapper[4817]: I0218 14:13:15.806839 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qvccv"] Feb 18 14:13:15 crc kubenswrapper[4817]: W0218 14:13:15.821597 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6647561_da17_4635_98dd_b0296e327e7c.slice/crio-ba3e47579f56711ae4e9de6001fb0b7005b1ff9c768409156ec8a4064465a7dd WatchSource:0}: Error finding container ba3e47579f56711ae4e9de6001fb0b7005b1ff9c768409156ec8a4064465a7dd: Status 404 returned error can't find the container with id ba3e47579f56711ae4e9de6001fb0b7005b1ff9c768409156ec8a4064465a7dd Feb 18 14:13:16 crc kubenswrapper[4817]: I0218 14:13:16.434618 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qvccv" event={"ID":"c6647561-da17-4635-98dd-b0296e327e7c","Type":"ContainerStarted","Data":"ba3e47579f56711ae4e9de6001fb0b7005b1ff9c768409156ec8a4064465a7dd"} Feb 18 14:13:17 crc kubenswrapper[4817]: I0218 14:13:17.947371 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-dzb5b" Feb 18 14:13:19 crc kubenswrapper[4817]: I0218 14:13:19.459959 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qvccv" event={"ID":"c6647561-da17-4635-98dd-b0296e327e7c","Type":"ContainerStarted","Data":"bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7"} Feb 18 14:13:19 crc kubenswrapper[4817]: I0218 14:13:19.474711 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-qvccv" podStartSLOduration=2.882865391 podStartE2EDuration="5.474692589s" podCreationTimestamp="2026-02-18 14:13:14 +0000 UTC" firstStartedPulling="2026-02-18 14:13:15.82875146 +0000 UTC m=+858.404287433" lastFinishedPulling="2026-02-18 14:13:18.420578648 +0000 UTC m=+860.996114631" observedRunningTime="2026-02-18 14:13:19.473299074 +0000 UTC m=+862.048835057" watchObservedRunningTime="2026-02-18 14:13:19.474692589 +0000 UTC m=+862.050228572" Feb 18 14:13:20 crc kubenswrapper[4817]: I0218 14:13:20.186720 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qvccv"] Feb 18 14:13:20 crc kubenswrapper[4817]: I0218 14:13:20.787092 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hfr8r"] Feb 18 14:13:20 crc kubenswrapper[4817]: I0218 14:13:20.788817 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hfr8r" Feb 18 14:13:20 crc kubenswrapper[4817]: I0218 14:13:20.801911 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hfr8r"] Feb 18 14:13:20 crc kubenswrapper[4817]: I0218 14:13:20.894028 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hklm5\" (UniqueName: \"kubernetes.io/projected/51aa0947-7a1c-4a40-bd45-299bb95ff9f1-kube-api-access-hklm5\") pod \"openstack-operator-index-hfr8r\" (UID: \"51aa0947-7a1c-4a40-bd45-299bb95ff9f1\") " pod="openstack-operators/openstack-operator-index-hfr8r" Feb 18 14:13:20 crc kubenswrapper[4817]: I0218 14:13:20.994818 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hklm5\" (UniqueName: \"kubernetes.io/projected/51aa0947-7a1c-4a40-bd45-299bb95ff9f1-kube-api-access-hklm5\") pod \"openstack-operator-index-hfr8r\" (UID: 
\"51aa0947-7a1c-4a40-bd45-299bb95ff9f1\") " pod="openstack-operators/openstack-operator-index-hfr8r" Feb 18 14:13:21 crc kubenswrapper[4817]: I0218 14:13:21.016886 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hklm5\" (UniqueName: \"kubernetes.io/projected/51aa0947-7a1c-4a40-bd45-299bb95ff9f1-kube-api-access-hklm5\") pod \"openstack-operator-index-hfr8r\" (UID: \"51aa0947-7a1c-4a40-bd45-299bb95ff9f1\") " pod="openstack-operators/openstack-operator-index-hfr8r" Feb 18 14:13:21 crc kubenswrapper[4817]: I0218 14:13:21.107495 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hfr8r" Feb 18 14:13:21 crc kubenswrapper[4817]: I0218 14:13:21.474636 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-qvccv" podUID="c6647561-da17-4635-98dd-b0296e327e7c" containerName="registry-server" containerID="cri-o://bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7" gracePeriod=2 Feb 18 14:13:21 crc kubenswrapper[4817]: I0218 14:13:21.529479 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hfr8r"] Feb 18 14:13:21 crc kubenswrapper[4817]: I0218 14:13:21.842724 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qvccv" Feb 18 14:13:21 crc kubenswrapper[4817]: I0218 14:13:21.908214 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jklpz\" (UniqueName: \"kubernetes.io/projected/c6647561-da17-4635-98dd-b0296e327e7c-kube-api-access-jklpz\") pod \"c6647561-da17-4635-98dd-b0296e327e7c\" (UID: \"c6647561-da17-4635-98dd-b0296e327e7c\") " Feb 18 14:13:21 crc kubenswrapper[4817]: I0218 14:13:21.912795 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6647561-da17-4635-98dd-b0296e327e7c-kube-api-access-jklpz" (OuterVolumeSpecName: "kube-api-access-jklpz") pod "c6647561-da17-4635-98dd-b0296e327e7c" (UID: "c6647561-da17-4635-98dd-b0296e327e7c"). InnerVolumeSpecName "kube-api-access-jklpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.009881 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jklpz\" (UniqueName: \"kubernetes.io/projected/c6647561-da17-4635-98dd-b0296e327e7c-kube-api-access-jklpz\") on node \"crc\" DevicePath \"\"" Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.484740 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hfr8r" event={"ID":"51aa0947-7a1c-4a40-bd45-299bb95ff9f1","Type":"ContainerStarted","Data":"e39864f0bba8c4c71f44d8651bb6d508b21a006e5365fc8fcac53cf5c1a2f080"} Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.485191 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hfr8r" event={"ID":"51aa0947-7a1c-4a40-bd45-299bb95ff9f1","Type":"ContainerStarted","Data":"1d1e230cf1a9739a51fe2e10c9e54266d54b459778576ca41102bcd5828ec48f"} Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.487296 4817 generic.go:334] "Generic (PLEG): container finished" 
podID="c6647561-da17-4635-98dd-b0296e327e7c" containerID="bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7" exitCode=0 Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.487334 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qvccv" event={"ID":"c6647561-da17-4635-98dd-b0296e327e7c","Type":"ContainerDied","Data":"bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7"} Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.487356 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qvccv" event={"ID":"c6647561-da17-4635-98dd-b0296e327e7c","Type":"ContainerDied","Data":"ba3e47579f56711ae4e9de6001fb0b7005b1ff9c768409156ec8a4064465a7dd"} Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.487376 4817 scope.go:117] "RemoveContainer" containerID="bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7" Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.487485 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qvccv" Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.507155 4817 scope.go:117] "RemoveContainer" containerID="bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7" Feb 18 14:13:22 crc kubenswrapper[4817]: E0218 14:13:22.507727 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7\": container with ID starting with bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7 not found: ID does not exist" containerID="bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7" Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.507781 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7"} err="failed to get container status \"bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7\": rpc error: code = NotFound desc = could not find container \"bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7\": container with ID starting with bfb5e0b57becca89b52ed4ca5d5fbb7d0e79fca0812d77673cba4b0cf74536b7 not found: ID does not exist" Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.519098 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hfr8r" podStartSLOduration=2.456830993 podStartE2EDuration="2.519068736s" podCreationTimestamp="2026-02-18 14:13:20 +0000 UTC" firstStartedPulling="2026-02-18 14:13:21.594815089 +0000 UTC m=+864.170351082" lastFinishedPulling="2026-02-18 14:13:21.657052842 +0000 UTC m=+864.232588825" observedRunningTime="2026-02-18 14:13:22.50410106 +0000 UTC m=+865.079637053" watchObservedRunningTime="2026-02-18 14:13:22.519068736 +0000 UTC m=+865.094604729" Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 
14:13:22.523004 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qvccv"] Feb 18 14:13:22 crc kubenswrapper[4817]: I0218 14:13:22.528527 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-qvccv"] Feb 18 14:13:24 crc kubenswrapper[4817]: I0218 14:13:24.181120 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6647561-da17-4635-98dd-b0296e327e7c" path="/var/lib/kubelet/pods/c6647561-da17-4635-98dd-b0296e327e7c/volumes" Feb 18 14:13:31 crc kubenswrapper[4817]: I0218 14:13:31.107755 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hfr8r" Feb 18 14:13:31 crc kubenswrapper[4817]: I0218 14:13:31.108225 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hfr8r" Feb 18 14:13:31 crc kubenswrapper[4817]: I0218 14:13:31.131346 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hfr8r" Feb 18 14:13:31 crc kubenswrapper[4817]: I0218 14:13:31.577114 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hfr8r" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.435732 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g4shq"] Feb 18 14:13:34 crc kubenswrapper[4817]: E0218 14:13:34.436370 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6647561-da17-4635-98dd-b0296e327e7c" containerName="registry-server" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.436386 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6647561-da17-4635-98dd-b0296e327e7c" containerName="registry-server" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.436529 4817 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c6647561-da17-4635-98dd-b0296e327e7c" containerName="registry-server" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.437586 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.450498 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4shq"] Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.488460 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-utilities\") pod \"redhat-marketplace-g4shq\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.488514 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdl86\" (UniqueName: \"kubernetes.io/projected/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-kube-api-access-gdl86\") pod \"redhat-marketplace-g4shq\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.488544 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-catalog-content\") pod \"redhat-marketplace-g4shq\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.589697 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdl86\" (UniqueName: \"kubernetes.io/projected/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-kube-api-access-gdl86\") pod 
\"redhat-marketplace-g4shq\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.589750 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-catalog-content\") pod \"redhat-marketplace-g4shq\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.589827 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-utilities\") pod \"redhat-marketplace-g4shq\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.590238 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-utilities\") pod \"redhat-marketplace-g4shq\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.590692 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-catalog-content\") pod \"redhat-marketplace-g4shq\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.612395 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdl86\" (UniqueName: \"kubernetes.io/projected/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-kube-api-access-gdl86\") pod \"redhat-marketplace-g4shq\" (UID: 
\"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.754753 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:34 crc kubenswrapper[4817]: I0218 14:13:34.970389 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4shq"] Feb 18 14:13:35 crc kubenswrapper[4817]: I0218 14:13:35.575205 4817 generic.go:334] "Generic (PLEG): container finished" podID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerID="4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c" exitCode=0 Feb 18 14:13:35 crc kubenswrapper[4817]: I0218 14:13:35.575308 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4shq" event={"ID":"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca","Type":"ContainerDied","Data":"4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c"} Feb 18 14:13:35 crc kubenswrapper[4817]: I0218 14:13:35.575531 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4shq" event={"ID":"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca","Type":"ContainerStarted","Data":"be4f3afeead4a0fc4447950802fee91e1cc47c20e42004a48138d40e0bbde1ed"} Feb 18 14:13:35 crc kubenswrapper[4817]: I0218 14:13:35.576783 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 14:13:37 crc kubenswrapper[4817]: I0218 14:13:37.588085 4817 generic.go:334] "Generic (PLEG): container finished" podID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerID="5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958" exitCode=0 Feb 18 14:13:37 crc kubenswrapper[4817]: I0218 14:13:37.588148 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4shq" 
event={"ID":"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca","Type":"ContainerDied","Data":"5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958"} Feb 18 14:13:38 crc kubenswrapper[4817]: I0218 14:13:38.601435 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4shq" event={"ID":"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca","Type":"ContainerStarted","Data":"fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233"} Feb 18 14:13:38 crc kubenswrapper[4817]: I0218 14:13:38.617597 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g4shq" podStartSLOduration=2.182401441 podStartE2EDuration="4.617580335s" podCreationTimestamp="2026-02-18 14:13:34 +0000 UTC" firstStartedPulling="2026-02-18 14:13:35.576514221 +0000 UTC m=+878.152050204" lastFinishedPulling="2026-02-18 14:13:38.011693115 +0000 UTC m=+880.587229098" observedRunningTime="2026-02-18 14:13:38.614951659 +0000 UTC m=+881.190487642" watchObservedRunningTime="2026-02-18 14:13:38.617580335 +0000 UTC m=+881.193116318" Feb 18 14:13:44 crc kubenswrapper[4817]: I0218 14:13:44.755058 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:44 crc kubenswrapper[4817]: I0218 14:13:44.755472 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:44 crc kubenswrapper[4817]: I0218 14:13:44.804401 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.442542 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc"] Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.443701 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.450650 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-dwkzz" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.456932 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc"] Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.534248 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-util\") pod \"5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.534632 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvcmh\" (UniqueName: \"kubernetes.io/projected/719bba7d-4fc5-4a70-9711-ecb679a5055a-kube-api-access-qvcmh\") pod \"5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.534789 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-bundle\") pod \"5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 
14:13:45.635912 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvcmh\" (UniqueName: \"kubernetes.io/projected/719bba7d-4fc5-4a70-9711-ecb679a5055a-kube-api-access-qvcmh\") pod \"5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.636001 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-bundle\") pod \"5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.636054 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-util\") pod \"5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.636667 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-bundle\") pod \"5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.636683 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-util\") pod \"5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.665817 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvcmh\" (UniqueName: \"kubernetes.io/projected/719bba7d-4fc5-4a70-9711-ecb679a5055a-kube-api-access-qvcmh\") pod \"5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.691160 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:45 crc kubenswrapper[4817]: I0218 14:13:45.763066 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:46 crc kubenswrapper[4817]: I0218 14:13:46.209393 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc"] Feb 18 14:13:46 crc kubenswrapper[4817]: I0218 14:13:46.651784 4817 generic.go:334] "Generic (PLEG): container finished" podID="719bba7d-4fc5-4a70-9711-ecb679a5055a" containerID="7f2d40e68bf0f8809d712ba4af0d9407a60be23cbcf80c2cf1a2b987a84c1ddd" exitCode=0 Feb 18 14:13:46 crc kubenswrapper[4817]: I0218 14:13:46.652170 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" event={"ID":"719bba7d-4fc5-4a70-9711-ecb679a5055a","Type":"ContainerDied","Data":"7f2d40e68bf0f8809d712ba4af0d9407a60be23cbcf80c2cf1a2b987a84c1ddd"} Feb 18 14:13:46 crc kubenswrapper[4817]: I0218 14:13:46.652321 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" event={"ID":"719bba7d-4fc5-4a70-9711-ecb679a5055a","Type":"ContainerStarted","Data":"935f825244c9252c70eb4e41b2b74cf65b5d3ab97917244d59cfe026ca22b16a"} Feb 18 14:13:47 crc kubenswrapper[4817]: I0218 14:13:47.660867 4817 generic.go:334] "Generic (PLEG): container finished" podID="719bba7d-4fc5-4a70-9711-ecb679a5055a" containerID="b725da32f2b38dc8b97f7b6062a6290e327e3b7e807afd1a6c65f332094d6395" exitCode=0 Feb 18 14:13:47 crc kubenswrapper[4817]: I0218 14:13:47.660923 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" event={"ID":"719bba7d-4fc5-4a70-9711-ecb679a5055a","Type":"ContainerDied","Data":"b725da32f2b38dc8b97f7b6062a6290e327e3b7e807afd1a6c65f332094d6395"} Feb 18 14:13:48 crc kubenswrapper[4817]: I0218 14:13:48.670344 4817 generic.go:334] 
"Generic (PLEG): container finished" podID="719bba7d-4fc5-4a70-9711-ecb679a5055a" containerID="2988b35b4304a3f5e87f6dd2bbbf7147e475eaea3ceb592c1150a56079a48f33" exitCode=0 Feb 18 14:13:48 crc kubenswrapper[4817]: I0218 14:13:48.670445 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" event={"ID":"719bba7d-4fc5-4a70-9711-ecb679a5055a","Type":"ContainerDied","Data":"2988b35b4304a3f5e87f6dd2bbbf7147e475eaea3ceb592c1150a56079a48f33"} Feb 18 14:13:49 crc kubenswrapper[4817]: I0218 14:13:49.593531 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4shq"] Feb 18 14:13:49 crc kubenswrapper[4817]: I0218 14:13:49.594343 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g4shq" podUID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerName="registry-server" containerID="cri-o://fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233" gracePeriod=2 Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.017615 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.020091 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.097889 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdl86\" (UniqueName: \"kubernetes.io/projected/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-kube-api-access-gdl86\") pod \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.097959 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-util\") pod \"719bba7d-4fc5-4a70-9711-ecb679a5055a\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.097998 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-bundle\") pod \"719bba7d-4fc5-4a70-9711-ecb679a5055a\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.098050 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvcmh\" (UniqueName: \"kubernetes.io/projected/719bba7d-4fc5-4a70-9711-ecb679a5055a-kube-api-access-qvcmh\") pod \"719bba7d-4fc5-4a70-9711-ecb679a5055a\" (UID: \"719bba7d-4fc5-4a70-9711-ecb679a5055a\") " Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.098076 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-utilities\") pod \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.098125 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-catalog-content\") pod \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\" (UID: \"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca\") " Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.099041 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-bundle" (OuterVolumeSpecName: "bundle") pod "719bba7d-4fc5-4a70-9711-ecb679a5055a" (UID: "719bba7d-4fc5-4a70-9711-ecb679a5055a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.103951 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719bba7d-4fc5-4a70-9711-ecb679a5055a-kube-api-access-qvcmh" (OuterVolumeSpecName: "kube-api-access-qvcmh") pod "719bba7d-4fc5-4a70-9711-ecb679a5055a" (UID: "719bba7d-4fc5-4a70-9711-ecb679a5055a"). InnerVolumeSpecName "kube-api-access-qvcmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.105106 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-utilities" (OuterVolumeSpecName: "utilities") pod "d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" (UID: "d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.107351 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-kube-api-access-gdl86" (OuterVolumeSpecName: "kube-api-access-gdl86") pod "d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" (UID: "d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca"). InnerVolumeSpecName "kube-api-access-gdl86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.113423 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-util" (OuterVolumeSpecName: "util") pod "719bba7d-4fc5-4a70-9711-ecb679a5055a" (UID: "719bba7d-4fc5-4a70-9711-ecb679a5055a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.126927 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" (UID: "d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.199659 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdl86\" (UniqueName: \"kubernetes.io/projected/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-kube-api-access-gdl86\") on node \"crc\" DevicePath \"\"" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.199697 4817 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-util\") on node \"crc\" DevicePath \"\"" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.199716 4817 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/719bba7d-4fc5-4a70-9711-ecb679a5055a-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.199728 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvcmh\" (UniqueName: \"kubernetes.io/projected/719bba7d-4fc5-4a70-9711-ecb679a5055a-kube-api-access-qvcmh\") on node \"crc\" DevicePath \"\"" Feb 18 14:13:50 crc 
kubenswrapper[4817]: I0218 14:13:50.199741 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.199752 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.684523 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" event={"ID":"719bba7d-4fc5-4a70-9711-ecb679a5055a","Type":"ContainerDied","Data":"935f825244c9252c70eb4e41b2b74cf65b5d3ab97917244d59cfe026ca22b16a"} Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.684561 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.684566 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="935f825244c9252c70eb4e41b2b74cf65b5d3ab97917244d59cfe026ca22b16a" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.686607 4817 generic.go:334] "Generic (PLEG): container finished" podID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerID="fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233" exitCode=0 Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.686640 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g4shq" event={"ID":"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca","Type":"ContainerDied","Data":"fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233"} Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.686665 4817 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-g4shq" event={"ID":"d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca","Type":"ContainerDied","Data":"be4f3afeead4a0fc4447950802fee91e1cc47c20e42004a48138d40e0bbde1ed"} Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.686681 4817 scope.go:117] "RemoveContainer" containerID="fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.686768 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g4shq" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.709883 4817 scope.go:117] "RemoveContainer" containerID="5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.726461 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4shq"] Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.731492 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g4shq"] Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.746137 4817 scope.go:117] "RemoveContainer" containerID="4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.769268 4817 scope.go:117] "RemoveContainer" containerID="fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233" Feb 18 14:13:50 crc kubenswrapper[4817]: E0218 14:13:50.769752 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233\": container with ID starting with fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233 not found: ID does not exist" containerID="fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233" Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 
14:13:50.769858 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233"} err="failed to get container status \"fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233\": rpc error: code = NotFound desc = could not find container \"fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233\": container with ID starting with fa9fcb711a8bd09cd7caa8de0090efb3985a70bd507f9ea23f6dd1635c0f5233 not found: ID does not exist"
Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.769945 4817 scope.go:117] "RemoveContainer" containerID="5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958"
Feb 18 14:13:50 crc kubenswrapper[4817]: E0218 14:13:50.770426 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958\": container with ID starting with 5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958 not found: ID does not exist" containerID="5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958"
Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.770460 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958"} err="failed to get container status \"5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958\": rpc error: code = NotFound desc = could not find container \"5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958\": container with ID starting with 5f8ceb388970320522bb45007183fba546683114f54fb55f26c5cec40cfd5958 not found: ID does not exist"
Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.770483 4817 scope.go:117] "RemoveContainer" containerID="4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c"
Feb 18 14:13:50 crc kubenswrapper[4817]: E0218 14:13:50.770856 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c\": container with ID starting with 4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c not found: ID does not exist" containerID="4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c"
Feb 18 14:13:50 crc kubenswrapper[4817]: I0218 14:13:50.770952 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c"} err="failed to get container status \"4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c\": rpc error: code = NotFound desc = could not find container \"4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c\": container with ID starting with 4be870f74bf7dd7c165bd7b0ebd446d1ad2a492a09b439f182eeb508807b001c not found: ID does not exist"
Feb 18 14:13:52 crc kubenswrapper[4817]: I0218 14:13:52.179447 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" path="/var/lib/kubelet/pods/d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca/volumes"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.229951 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4"]
Feb 18 14:13:53 crc kubenswrapper[4817]: E0218 14:13:53.230622 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719bba7d-4fc5-4a70-9711-ecb679a5055a" containerName="util"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.230641 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="719bba7d-4fc5-4a70-9711-ecb679a5055a" containerName="util"
Feb 18 14:13:53 crc kubenswrapper[4817]: E0218 14:13:53.230658 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerName="extract-content"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.230665 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerName="extract-content"
Feb 18 14:13:53 crc kubenswrapper[4817]: E0218 14:13:53.230678 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerName="extract-utilities"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.230686 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerName="extract-utilities"
Feb 18 14:13:53 crc kubenswrapper[4817]: E0218 14:13:53.230702 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerName="registry-server"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.230709 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerName="registry-server"
Feb 18 14:13:53 crc kubenswrapper[4817]: E0218 14:13:53.230722 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719bba7d-4fc5-4a70-9711-ecb679a5055a" containerName="pull"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.230729 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="719bba7d-4fc5-4a70-9711-ecb679a5055a" containerName="pull"
Feb 18 14:13:53 crc kubenswrapper[4817]: E0218 14:13:53.230743 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719bba7d-4fc5-4a70-9711-ecb679a5055a" containerName="extract"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.230749 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="719bba7d-4fc5-4a70-9711-ecb679a5055a" containerName="extract"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.230877 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="719bba7d-4fc5-4a70-9711-ecb679a5055a" containerName="extract"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.230891 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e17f68-a35f-4e4d-a9fa-8aca0dc601ca" containerName="registry-server"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.231422 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.236556 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-hsdn2"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.255759 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4"]
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.342671 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5sg4\" (UniqueName: \"kubernetes.io/projected/b05d374b-b714-4826-80b8-246c15521534-kube-api-access-q5sg4\") pod \"openstack-operator-controller-init-5864f6ff6b-g4lc4\" (UID: \"b05d374b-b714-4826-80b8-246c15521534\") " pod="openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.444147 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5sg4\" (UniqueName: \"kubernetes.io/projected/b05d374b-b714-4826-80b8-246c15521534-kube-api-access-q5sg4\") pod \"openstack-operator-controller-init-5864f6ff6b-g4lc4\" (UID: \"b05d374b-b714-4826-80b8-246c15521534\") " pod="openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.477401 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5sg4\" (UniqueName: \"kubernetes.io/projected/b05d374b-b714-4826-80b8-246c15521534-kube-api-access-q5sg4\") pod \"openstack-operator-controller-init-5864f6ff6b-g4lc4\" (UID: \"b05d374b-b714-4826-80b8-246c15521534\") " pod="openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.556897 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4"
Feb 18 14:13:53 crc kubenswrapper[4817]: I0218 14:13:53.968245 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4"]
Feb 18 14:13:54 crc kubenswrapper[4817]: I0218 14:13:54.719413 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4" event={"ID":"b05d374b-b714-4826-80b8-246c15521534","Type":"ContainerStarted","Data":"a12cbf8a69af6fe5370fc8e7de930fdf65ebb7b0af097398a9e0ddce284b4e32"}
Feb 18 14:13:58 crc kubenswrapper[4817]: I0218 14:13:58.753680 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4" event={"ID":"b05d374b-b714-4826-80b8-246c15521534","Type":"ContainerStarted","Data":"e8b9606e47c3385295e58b017002dd2e5497951fe647c8be5a2b65220abd3b67"}
Feb 18 14:13:58 crc kubenswrapper[4817]: I0218 14:13:58.754300 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4"
Feb 18 14:13:58 crc kubenswrapper[4817]: I0218 14:13:58.783386 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4" podStartSLOduration=2.097004791 podStartE2EDuration="5.783345805s" podCreationTimestamp="2026-02-18 14:13:53 +0000 UTC" firstStartedPulling="2026-02-18 14:13:53.976127133 +0000 UTC m=+896.551663116" lastFinishedPulling="2026-02-18 14:13:57.662468157 +0000 UTC m=+900.238004130" observedRunningTime="2026-02-18 14:13:58.77879168 +0000 UTC m=+901.354327663" watchObservedRunningTime="2026-02-18 14:13:58.783345805 +0000 UTC m=+901.358881788"
Feb 18 14:14:03 crc kubenswrapper[4817]: I0218 14:14:03.560765 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5864f6ff6b-g4lc4"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.230446 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.233854 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.234107 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.234753 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.241569 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-26jm6"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.241849 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-6hvmt"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.246261 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.262915 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.272951 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.273792 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.276658 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-bg2s4"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.345520 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.383591 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml7bh\" (UniqueName: \"kubernetes.io/projected/4441a78b-c58a-4030-801b-06dbfa1729b1-kube-api-access-ml7bh\") pod \"designate-operator-controller-manager-6d8bf5c495-hwmjj\" (UID: \"4441a78b-c58a-4030-801b-06dbfa1729b1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.383698 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfddg\" (UniqueName: \"kubernetes.io/projected/4f0674c2-f05e-4276-b2b0-dc5ed66c187a-kube-api-access-qfddg\") pod \"barbican-operator-controller-manager-868647ff47-rs8vm\" (UID: \"4f0674c2-f05e-4276-b2b0-dc5ed66c187a\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.383788 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxpm\" (UniqueName: \"kubernetes.io/projected/4339e125-4e60-44d9-8e15-97b4000669e2-kube-api-access-6xxpm\") pod \"cinder-operator-controller-manager-5d946d989d-z64rl\" (UID: \"4339e125-4e60-44d9-8e15-97b4000669e2\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.404971 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.406023 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.411552 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-f8nz2"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.451337 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.484947 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml7bh\" (UniqueName: \"kubernetes.io/projected/4441a78b-c58a-4030-801b-06dbfa1729b1-kube-api-access-ml7bh\") pod \"designate-operator-controller-manager-6d8bf5c495-hwmjj\" (UID: \"4441a78b-c58a-4030-801b-06dbfa1729b1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.485222 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfddg\" (UniqueName: \"kubernetes.io/projected/4f0674c2-f05e-4276-b2b0-dc5ed66c187a-kube-api-access-qfddg\") pod \"barbican-operator-controller-manager-868647ff47-rs8vm\" (UID: \"4f0674c2-f05e-4276-b2b0-dc5ed66c187a\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.485385 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxpm\" (UniqueName: \"kubernetes.io/projected/4339e125-4e60-44d9-8e15-97b4000669e2-kube-api-access-6xxpm\") pod \"cinder-operator-controller-manager-5d946d989d-z64rl\" (UID: \"4339e125-4e60-44d9-8e15-97b4000669e2\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.485484 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlbnm\" (UniqueName: \"kubernetes.io/projected/00831f79-f6d5-4896-b718-4120117751b8-kube-api-access-mlbnm\") pod \"glance-operator-controller-manager-77987464f4-c2b7x\" (UID: \"00831f79-f6d5-4896-b718-4120117751b8\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.510048 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.511458 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.513970 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8xg2s"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.531588 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxpm\" (UniqueName: \"kubernetes.io/projected/4339e125-4e60-44d9-8e15-97b4000669e2-kube-api-access-6xxpm\") pod \"cinder-operator-controller-manager-5d946d989d-z64rl\" (UID: \"4339e125-4e60-44d9-8e15-97b4000669e2\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.538866 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfddg\" (UniqueName: \"kubernetes.io/projected/4f0674c2-f05e-4276-b2b0-dc5ed66c187a-kube-api-access-qfddg\") pod \"barbican-operator-controller-manager-868647ff47-rs8vm\" (UID: \"4f0674c2-f05e-4276-b2b0-dc5ed66c187a\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.539433 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml7bh\" (UniqueName: \"kubernetes.io/projected/4441a78b-c58a-4030-801b-06dbfa1729b1-kube-api-access-ml7bh\") pod \"designate-operator-controller-manager-6d8bf5c495-hwmjj\" (UID: \"4441a78b-c58a-4030-801b-06dbfa1729b1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.551163 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.567439 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.568402 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.569197 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.570800 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7tgsq"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.586754 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlbnm\" (UniqueName: \"kubernetes.io/projected/00831f79-f6d5-4896-b718-4120117751b8-kube-api-access-mlbnm\") pod \"glance-operator-controller-manager-77987464f4-c2b7x\" (UID: \"00831f79-f6d5-4896-b718-4120117751b8\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.586792 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhgkt\" (UniqueName: \"kubernetes.io/projected/13246e06-5b63-4076-a556-de264d7afdf4-kube-api-access-nhgkt\") pod \"heat-operator-controller-manager-69f49c598c-zknf5\" (UID: \"13246e06-5b63-4076-a556-de264d7afdf4\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.587117 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.587856 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.588943 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.591328 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.591881 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7rw7f"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.614157 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlbnm\" (UniqueName: \"kubernetes.io/projected/00831f79-f6d5-4896-b718-4120117751b8-kube-api-access-mlbnm\") pod \"glance-operator-controller-manager-77987464f4-c2b7x\" (UID: \"00831f79-f6d5-4896-b718-4120117751b8\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.626357 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.635648 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.648334 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.655349 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.656649 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.659089 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-nf9d5"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.662094 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.669921 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.670874 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.678218 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.678711 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-vff8v"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.685035 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.685232 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.685614 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.685680 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.687785 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rzfsj"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.688035 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rzxtt"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.689314 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn42p\" (UniqueName: \"kubernetes.io/projected/e7be81da-3629-4713-87c6-34cabd9a8347-kube-api-access-sn42p\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.689383 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.689418 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srrcj\" (UniqueName: \"kubernetes.io/projected/5721fd5d-07bb-44df-bfb8-4b4dd80ac7a4-kube-api-access-srrcj\") pod \"horizon-operator-controller-manager-5b9b8895d5-gq259\" (UID: \"5721fd5d-07bb-44df-bfb8-4b4dd80ac7a4\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.689448 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhgkt\" (UniqueName: \"kubernetes.io/projected/13246e06-5b63-4076-a556-de264d7afdf4-kube-api-access-nhgkt\") pod \"heat-operator-controller-manager-69f49c598c-zknf5\" (UID: \"13246e06-5b63-4076-a556-de264d7afdf4\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.691140 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.693293 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.712435 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhgkt\" (UniqueName: \"kubernetes.io/projected/13246e06-5b63-4076-a556-de264d7afdf4-kube-api-access-nhgkt\") pod \"heat-operator-controller-manager-69f49c598c-zknf5\" (UID: \"13246e06-5b63-4076-a556-de264d7afdf4\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.726277 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.727902 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.729692 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-92dlw"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.735408 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.755248 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.756300 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.765039 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.766155 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.771662 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bg84z"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.773604 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-l2qsv"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.788139 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.790641 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js2nb\" (UniqueName: \"kubernetes.io/projected/644f07fc-02ce-49f0-87bf-54f765c15d8c-kube-api-access-js2nb\") pod \"mariadb-operator-controller-manager-6994f66f48-4rv4f\" (UID: \"644f07fc-02ce-49f0-87bf-54f765c15d8c\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.790673 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhwmj\" (UniqueName: \"kubernetes.io/projected/01962a92-98c7-412c-86a7-ee21e6cb92a9-kube-api-access-rhwmj\") pod \"ironic-operator-controller-manager-554564d7fc-64rvt\" (UID: \"01962a92-98c7-412c-86a7-ee21e6cb92a9\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.790703 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.790730 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs7nx\" (UniqueName: \"kubernetes.io/projected/85bd6fc0-d973-4172-b441-c15d4abeb604-kube-api-access-hs7nx\") pod \"keystone-operator-controller-manager-b4d948c87-fs8m7\" (UID: \"85bd6fc0-d973-4172-b441-c15d4abeb604\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.790752 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srrcj\" (UniqueName: \"kubernetes.io/projected/5721fd5d-07bb-44df-bfb8-4b4dd80ac7a4-kube-api-access-srrcj\") pod \"horizon-operator-controller-manager-5b9b8895d5-gq259\" (UID: \"5721fd5d-07bb-44df-bfb8-4b4dd80ac7a4\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.790798 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc4gt\" (UniqueName: \"kubernetes.io/projected/c5e8b4c9-5a63-44c3-9f6c-c7ee268dcef3-kube-api-access-dc4gt\") pod \"manila-operator-controller-manager-54f6768c69-4h8qx\" (UID: \"c5e8b4c9-5a63-44c3-9f6c-c7ee268dcef3\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.790826 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn42p\" (UniqueName: \"kubernetes.io/projected/e7be81da-3629-4713-87c6-34cabd9a8347-kube-api-access-sn42p\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:33 crc kubenswrapper[4817]: E0218 14:14:33.791179 4817 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 18 14:14:33 crc kubenswrapper[4817]: E0218 14:14:33.791226 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert podName:e7be81da-3629-4713-87c6-34cabd9a8347 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:34.291207419 +0000 UTC m=+936.866743402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert") pod "infra-operator-controller-manager-79d975b745-xpwgd" (UID: "e7be81da-3629-4713-87c6-34cabd9a8347") : secret "infra-operator-webhook-server-cert" not found
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.791995 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.821502 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn42p\" (UniqueName: \"kubernetes.io/projected/e7be81da-3629-4713-87c6-34cabd9a8347-kube-api-access-sn42p\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.829159 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srrcj\" (UniqueName: \"kubernetes.io/projected/5721fd5d-07bb-44df-bfb8-4b4dd80ac7a4-kube-api-access-srrcj\") pod \"horizon-operator-controller-manager-5b9b8895d5-gq259\" (UID: \"5721fd5d-07bb-44df-bfb8-4b4dd80ac7a4\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.834422 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.847574 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.850328 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.852829 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-p8qlc"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.861235 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.862531 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.870200 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-xthgr"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.870379 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.876368 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.880720 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.881970 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.885395 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-4mcmx"
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.890204 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-ff84x"]
Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.892197 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ff84x" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.893303 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs7nx\" (UniqueName: \"kubernetes.io/projected/85bd6fc0-d973-4172-b441-c15d4abeb604-kube-api-access-hs7nx\") pod \"keystone-operator-controller-manager-b4d948c87-fs8m7\" (UID: \"85bd6fc0-d973-4172-b441-c15d4abeb604\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.893371 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xd9t\" (UniqueName: \"kubernetes.io/projected/0db531ef-d3b4-4b35-9497-8892cbd3db77-kube-api-access-5xd9t\") pod \"octavia-operator-controller-manager-69f8888797-qlrqm\" (UID: \"0db531ef-d3b4-4b35-9497-8892cbd3db77\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.893414 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc4gt\" (UniqueName: \"kubernetes.io/projected/c5e8b4c9-5a63-44c3-9f6c-c7ee268dcef3-kube-api-access-dc4gt\") pod \"manila-operator-controller-manager-54f6768c69-4h8qx\" (UID: \"c5e8b4c9-5a63-44c3-9f6c-c7ee268dcef3\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.893464 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr4f7\" (UniqueName: \"kubernetes.io/projected/6e783396-37c1-4a0d-bfe4-495fdf4d41bf-kube-api-access-rr4f7\") pod \"neutron-operator-controller-manager-64ddbf8bb-dr67b\" (UID: \"6e783396-37c1-4a0d-bfe4-495fdf4d41bf\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b" Feb 18 14:14:33 crc 
kubenswrapper[4817]: I0218 14:14:33.893507 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js2nb\" (UniqueName: \"kubernetes.io/projected/644f07fc-02ce-49f0-87bf-54f765c15d8c-kube-api-access-js2nb\") pod \"mariadb-operator-controller-manager-6994f66f48-4rv4f\" (UID: \"644f07fc-02ce-49f0-87bf-54f765c15d8c\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.893531 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpr8t\" (UniqueName: \"kubernetes.io/projected/ca917110-0727-4c63-ad9a-20722a6cba34-kube-api-access-lpr8t\") pod \"nova-operator-controller-manager-567668f5cf-9jkwb\" (UID: \"ca917110-0727-4c63-ad9a-20722a6cba34\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.893555 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhwmj\" (UniqueName: \"kubernetes.io/projected/01962a92-98c7-412c-86a7-ee21e6cb92a9-kube-api-access-rhwmj\") pod \"ironic-operator-controller-manager-554564d7fc-64rvt\" (UID: \"01962a92-98c7-412c-86a7-ee21e6cb92a9\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.896536 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-n7wz4" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.904529 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"] Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.912394 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc4gt\" (UniqueName: 
\"kubernetes.io/projected/c5e8b4c9-5a63-44c3-9f6c-c7ee268dcef3-kube-api-access-dc4gt\") pod \"manila-operator-controller-manager-54f6768c69-4h8qx\" (UID: \"c5e8b4c9-5a63-44c3-9f6c-c7ee268dcef3\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.916676 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js2nb\" (UniqueName: \"kubernetes.io/projected/644f07fc-02ce-49f0-87bf-54f765c15d8c-kube-api-access-js2nb\") pod \"mariadb-operator-controller-manager-6994f66f48-4rv4f\" (UID: \"644f07fc-02ce-49f0-87bf-54f765c15d8c\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.922224 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-ff84x"] Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.935039 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhwmj\" (UniqueName: \"kubernetes.io/projected/01962a92-98c7-412c-86a7-ee21e6cb92a9-kube-api-access-rhwmj\") pod \"ironic-operator-controller-manager-554564d7fc-64rvt\" (UID: \"01962a92-98c7-412c-86a7-ee21e6cb92a9\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.979913 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs7nx\" (UniqueName: \"kubernetes.io/projected/85bd6fc0-d973-4172-b441-c15d4abeb604-kube-api-access-hs7nx\") pod \"keystone-operator-controller-manager-b4d948c87-fs8m7\" (UID: \"85bd6fc0-d973-4172-b441-c15d4abeb604\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7" Feb 18 14:14:33 crc kubenswrapper[4817]: I0218 14:14:33.999649 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.007329 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr4f7\" (UniqueName: \"kubernetes.io/projected/6e783396-37c1-4a0d-bfe4-495fdf4d41bf-kube-api-access-rr4f7\") pod \"neutron-operator-controller-manager-64ddbf8bb-dr67b\" (UID: \"6e783396-37c1-4a0d-bfe4-495fdf4d41bf\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.007524 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n89cj\" (UniqueName: \"kubernetes.io/projected/2c04c342-bd87-46e7-8a2d-72dc30f858aa-kube-api-access-n89cj\") pod \"placement-operator-controller-manager-8497b45c89-9td4r\" (UID: \"2c04c342-bd87-46e7-8a2d-72dc30f858aa\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.007599 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4lv6\" (UniqueName: \"kubernetes.io/projected/ad995216-386a-455b-b48d-378dbfd271bf-kube-api-access-p4lv6\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.007702 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpr8t\" (UniqueName: \"kubernetes.io/projected/ca917110-0727-4c63-ad9a-20722a6cba34-kube-api-access-lpr8t\") pod \"nova-operator-controller-manager-567668f5cf-9jkwb\" (UID: \"ca917110-0727-4c63-ad9a-20722a6cba34\") " 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.007787 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpqnk\" (UniqueName: \"kubernetes.io/projected/9d933918-c23c-456a-8b3f-08ee4c2909dd-kube-api-access-wpqnk\") pod \"ovn-operator-controller-manager-d44cf6b75-l9fqt\" (UID: \"9d933918-c23c-456a-8b3f-08ee4c2909dd\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.007921 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xd9t\" (UniqueName: \"kubernetes.io/projected/0db531ef-d3b4-4b35-9497-8892cbd3db77-kube-api-access-5xd9t\") pod \"octavia-operator-controller-manager-69f8888797-qlrqm\" (UID: \"0db531ef-d3b4-4b35-9497-8892cbd3db77\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.008343 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.010201 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.037690 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.038584 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcxr4\" (UniqueName: \"kubernetes.io/projected/b77d6c32-6c30-42be-ab69-36b969d40950-kube-api-access-qcxr4\") pod \"swift-operator-controller-manager-68f46476f-ff84x\" (UID: \"b77d6c32-6c30-42be-ab69-36b969d40950\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-ff84x" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.046001 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr4f7\" (UniqueName: \"kubernetes.io/projected/6e783396-37c1-4a0d-bfe4-495fdf4d41bf-kube-api-access-rr4f7\") pod \"neutron-operator-controller-manager-64ddbf8bb-dr67b\" (UID: \"6e783396-37c1-4a0d-bfe4-495fdf4d41bf\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.066091 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpr8t\" (UniqueName: \"kubernetes.io/projected/ca917110-0727-4c63-ad9a-20722a6cba34-kube-api-access-lpr8t\") pod \"nova-operator-controller-manager-567668f5cf-9jkwb\" (UID: \"ca917110-0727-4c63-ad9a-20722a6cba34\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.073420 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xd9t\" (UniqueName: \"kubernetes.io/projected/0db531ef-d3b4-4b35-9497-8892cbd3db77-kube-api-access-5xd9t\") pod \"octavia-operator-controller-manager-69f8888797-qlrqm\" (UID: \"0db531ef-d3b4-4b35-9497-8892cbd3db77\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.093908 4817 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.107033 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.111481 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.112877 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.117307 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-k7zxv" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.119484 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.149012 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpqnk\" (UniqueName: \"kubernetes.io/projected/9d933918-c23c-456a-8b3f-08ee4c2909dd-kube-api-access-wpqnk\") pod \"ovn-operator-controller-manager-d44cf6b75-l9fqt\" (UID: \"9d933918-c23c-456a-8b3f-08ee4c2909dd\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.149157 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.149193 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcxr4\" (UniqueName: \"kubernetes.io/projected/b77d6c32-6c30-42be-ab69-36b969d40950-kube-api-access-qcxr4\") pod \"swift-operator-controller-manager-68f46476f-ff84x\" (UID: \"b77d6c32-6c30-42be-ab69-36b969d40950\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-ff84x" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.149243 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n89cj\" (UniqueName: \"kubernetes.io/projected/2c04c342-bd87-46e7-8a2d-72dc30f858aa-kube-api-access-n89cj\") pod \"placement-operator-controller-manager-8497b45c89-9td4r\" (UID: \"2c04c342-bd87-46e7-8a2d-72dc30f858aa\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.149269 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4lv6\" (UniqueName: \"kubernetes.io/projected/ad995216-386a-455b-b48d-378dbfd271bf-kube-api-access-p4lv6\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" Feb 18 14:14:34 crc kubenswrapper[4817]: E0218 14:14:34.149935 4817 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 14:14:34 crc kubenswrapper[4817]: E0218 14:14:34.150011 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert podName:ad995216-386a-455b-b48d-378dbfd271bf nodeName:}" failed. 
No retries permitted until 2026-02-18 14:14:34.649973627 +0000 UTC m=+937.225509610 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" (UID: "ad995216-386a-455b-b48d-378dbfd271bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.155251 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.180866 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-8q5c8"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.184916 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcxr4\" (UniqueName: \"kubernetes.io/projected/b77d6c32-6c30-42be-ab69-36b969d40950-kube-api-access-qcxr4\") pod \"swift-operator-controller-manager-68f46476f-ff84x\" (UID: \"b77d6c32-6c30-42be-ab69-36b969d40950\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-ff84x" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.184653 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4lv6\" (UniqueName: \"kubernetes.io/projected/ad995216-386a-455b-b48d-378dbfd271bf-kube-api-access-p4lv6\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.184600 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.186779 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.192959 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-sf2vs" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.194048 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpqnk\" (UniqueName: \"kubernetes.io/projected/9d933918-c23c-456a-8b3f-08ee4c2909dd-kube-api-access-wpqnk\") pod \"ovn-operator-controller-manager-d44cf6b75-l9fqt\" (UID: \"9d933918-c23c-456a-8b3f-08ee4c2909dd\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.204735 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n89cj\" (UniqueName: \"kubernetes.io/projected/2c04c342-bd87-46e7-8a2d-72dc30f858aa-kube-api-access-n89cj\") pod \"placement-operator-controller-manager-8497b45c89-9td4r\" (UID: \"2c04c342-bd87-46e7-8a2d-72dc30f858aa\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.217943 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.245269 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.250637 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn7n9\" (UniqueName: \"kubernetes.io/projected/3374b90b-3a12-4e01-a0cb-ed7c51d844d7-kube-api-access-mn7n9\") pod \"telemetry-operator-controller-manager-6956d67c5c-xbjdr\" (UID: \"3374b90b-3a12-4e01-a0cb-ed7c51d844d7\") " pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.253181 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-8q5c8"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.253262 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.254163 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.256123 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dmxjp" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.261664 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.262025 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm62t\" (UniqueName: \"kubernetes.io/projected/bccec692-ee64-46e1-8979-e6173c132d8e-kube-api-access-tm62t\") pod \"test-operator-controller-manager-7866795846-8q5c8\" (UID: \"bccec692-ee64-46e1-8979-e6173c132d8e\") " pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.287451 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.324386 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.325864 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.332560 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.334232 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.338320 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.338391 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-h74xq" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.340945 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.342192 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.343235 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.346242 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-qhtg5" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.350178 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.362458 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.363629 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2vx9\" (UniqueName: \"kubernetes.io/projected/93f59b48-31c6-4fed-8ccc-7d722605d896-kube-api-access-k2vx9\") pod \"watcher-operator-controller-manager-5db88f68c-8pfz7\" (UID: \"93f59b48-31c6-4fed-8ccc-7d722605d896\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.363706 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm62t\" (UniqueName: \"kubernetes.io/projected/bccec692-ee64-46e1-8979-e6173c132d8e-kube-api-access-tm62t\") pod \"test-operator-controller-manager-7866795846-8q5c8\" (UID: \"bccec692-ee64-46e1-8979-e6173c132d8e\") " pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.363796 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.363854 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn7n9\" (UniqueName: \"kubernetes.io/projected/3374b90b-3a12-4e01-a0cb-ed7c51d844d7-kube-api-access-mn7n9\") pod \"telemetry-operator-controller-manager-6956d67c5c-xbjdr\" (UID: \"3374b90b-3a12-4e01-a0cb-ed7c51d844d7\") " pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" Feb 18 14:14:34 crc kubenswrapper[4817]: E0218 14:14:34.364366 4817 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 14:14:34 crc kubenswrapper[4817]: E0218 14:14:34.364449 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert podName:e7be81da-3629-4713-87c6-34cabd9a8347 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:35.36442428 +0000 UTC m=+937.939960423 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert") pod "infra-operator-controller-manager-79d975b745-xpwgd" (UID: "e7be81da-3629-4713-87c6-34cabd9a8347") : secret "infra-operator-webhook-server-cert" not found Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.376679 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.397992 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.398201 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn7n9\" (UniqueName: \"kubernetes.io/projected/3374b90b-3a12-4e01-a0cb-ed7c51d844d7-kube-api-access-mn7n9\") pod \"telemetry-operator-controller-manager-6956d67c5c-xbjdr\" (UID: \"3374b90b-3a12-4e01-a0cb-ed7c51d844d7\") " pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.403961 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm62t\" (UniqueName: \"kubernetes.io/projected/bccec692-ee64-46e1-8979-e6173c132d8e-kube-api-access-tm62t\") pod \"test-operator-controller-manager-7866795846-8q5c8\" (UID: \"bccec692-ee64-46e1-8979-e6173c132d8e\") " pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.416601 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ff84x" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.442835 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.464647 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m8d2\" (UniqueName: \"kubernetes.io/projected/086958e1-8a7d-40c9-9725-f18776f863a0-kube-api-access-7m8d2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5np9g\" (UID: \"086958e1-8a7d-40c9-9725-f18776f863a0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.464698 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swb8n\" (UniqueName: \"kubernetes.io/projected/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-kube-api-access-swb8n\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.464735 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2vx9\" (UniqueName: \"kubernetes.io/projected/93f59b48-31c6-4fed-8ccc-7d722605d896-kube-api-access-k2vx9\") pod \"watcher-operator-controller-manager-5db88f68c-8pfz7\" (UID: \"93f59b48-31c6-4fed-8ccc-7d722605d896\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.464764 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" Feb 18 
14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.464841 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.473414 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.487274 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2vx9\" (UniqueName: \"kubernetes.io/projected/93f59b48-31c6-4fed-8ccc-7d722605d896-kube-api-access-k2vx9\") pod \"watcher-operator-controller-manager-5db88f68c-8pfz7\" (UID: \"93f59b48-31c6-4fed-8ccc-7d722605d896\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.494825 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.518265 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.566786 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.566858 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m8d2\" (UniqueName: \"kubernetes.io/projected/086958e1-8a7d-40c9-9725-f18776f863a0-kube-api-access-7m8d2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5np9g\" (UID: \"086958e1-8a7d-40c9-9725-f18776f863a0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.566906 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swb8n\" (UniqueName: \"kubernetes.io/projected/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-kube-api-access-swb8n\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.566937 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" Feb 18 14:14:34 crc kubenswrapper[4817]: E0218 14:14:34.567109 4817 secret.go:188] 
Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 14:14:34 crc kubenswrapper[4817]: E0218 14:14:34.567167 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs podName:ff1b9fe3-84fe-47fc-902c-aa23c9e829d8 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:35.067148329 +0000 UTC m=+937.642684312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs") pod "openstack-operator-controller-manager-7954588dd9-dngjl" (UID: "ff1b9fe3-84fe-47fc-902c-aa23c9e829d8") : secret "webhook-server-cert" not found Feb 18 14:14:34 crc kubenswrapper[4817]: E0218 14:14:34.567222 4817 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 14:14:34 crc kubenswrapper[4817]: E0218 14:14:34.567246 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs podName:ff1b9fe3-84fe-47fc-902c-aa23c9e829d8 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:35.067238592 +0000 UTC m=+937.642774575 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs") pod "openstack-operator-controller-manager-7954588dd9-dngjl" (UID: "ff1b9fe3-84fe-47fc-902c-aa23c9e829d8") : secret "metrics-server-cert" not found Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.592715 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swb8n\" (UniqueName: \"kubernetes.io/projected/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-kube-api-access-swb8n\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.594167 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.602948 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m8d2\" (UniqueName: \"kubernetes.io/projected/086958e1-8a7d-40c9-9725-f18776f863a0-kube-api-access-7m8d2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-5np9g\" (UID: \"086958e1-8a7d-40c9-9725-f18776f863a0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.630569 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.682056 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" Feb 18 14:14:34 crc kubenswrapper[4817]: E0218 14:14:34.682602 4817 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 14:14:34 crc kubenswrapper[4817]: E0218 14:14:34.682688 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert podName:ad995216-386a-455b-b48d-378dbfd271bf nodeName:}" failed. No retries permitted until 2026-02-18 14:14:35.682671634 +0000 UTC m=+938.258207617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" (UID: "ad995216-386a-455b-b48d-378dbfd271bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.688872 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g" Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.748791 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.875675 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7"] Feb 18 14:14:34 crc kubenswrapper[4817]: I0218 14:14:34.893683 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt"] Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.029902 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b"] Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.041429 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx"] Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.045449 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj" event={"ID":"4441a78b-c58a-4030-801b-06dbfa1729b1","Type":"ContainerStarted","Data":"077ed78458c8483e09192e6c6f69405b15bc2715366561aed565f6f5d401be43"} Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.046807 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7" event={"ID":"85bd6fc0-d973-4172-b441-c15d4abeb604","Type":"ContainerStarted","Data":"7347712d4591c9791ba40f29327f8cea5dd370be41b20b5832288b1f069a0a66"} Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.054950 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x" 
event={"ID":"00831f79-f6d5-4896-b718-4120117751b8","Type":"ContainerStarted","Data":"3c5171c34ecd1de266341dbfd88ba51852ecf50b63a28991f3aab410f94b817c"} Feb 18 14:14:35 crc kubenswrapper[4817]: W0218 14:14:35.055455 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e783396_37c1_4a0d_bfe4_495fdf4d41bf.slice/crio-fb97e3bf4246884fca771c3ce59079eb33f0b0efaef7cd8beb824ac620b8b3ed WatchSource:0}: Error finding container fb97e3bf4246884fca771c3ce59079eb33f0b0efaef7cd8beb824ac620b8b3ed: Status 404 returned error can't find the container with id fb97e3bf4246884fca771c3ce59079eb33f0b0efaef7cd8beb824ac620b8b3ed Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.057591 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl" event={"ID":"4339e125-4e60-44d9-8e15-97b4000669e2","Type":"ContainerStarted","Data":"e57a0d051a058876f2e7ff92fa939b4f380e39eed14f4329518d86035a84eae1"} Feb 18 14:14:35 crc kubenswrapper[4817]: W0218 14:14:35.062050 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5e8b4c9_5a63_44c3_9f6c_c7ee268dcef3.slice/crio-d104a3cb5e8699e0daf248921809fec77629b181229c8bca67eeddc38dd91cd9 WatchSource:0}: Error finding container d104a3cb5e8699e0daf248921809fec77629b181229c8bca67eeddc38dd91cd9: Status 404 returned error can't find the container with id d104a3cb5e8699e0daf248921809fec77629b181229c8bca67eeddc38dd91cd9 Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.062864 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt" event={"ID":"01962a92-98c7-412c-86a7-ee21e6cb92a9","Type":"ContainerStarted","Data":"c4936e940390a64d6d4c9530864c417a13e17fb790698a8c5ae21e8f6c7a073b"} Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.067734 4817 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm" event={"ID":"4f0674c2-f05e-4276-b2b0-dc5ed66c187a","Type":"ContainerStarted","Data":"b4492b635d8bc137d623d8cff725e3ef4e1b3623fcd931db247275bc17e202f5"} Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.069208 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5" event={"ID":"13246e06-5b63-4076-a556-de264d7afdf4","Type":"ContainerStarted","Data":"256e053a0558dc256dc8742fc3d1ca6f60c0258cd64f9ab9f65d324463e35e54"} Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.070355 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259" event={"ID":"5721fd5d-07bb-44df-bfb8-4b4dd80ac7a4","Type":"ContainerStarted","Data":"49efc0d9ac5a70675c314d952687d950ea2045ac43a9ee5e8263556717e30ef6"} Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.095557 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.095686 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.095852 4817 secret.go:188] Couldn't get secret 
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.095876 4817 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.095917 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs podName:ff1b9fe3-84fe-47fc-902c-aa23c9e829d8 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:36.095896377 +0000 UTC m=+938.671432360 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs") pod "openstack-operator-controller-manager-7954588dd9-dngjl" (UID: "ff1b9fe3-84fe-47fc-902c-aa23c9e829d8") : secret "metrics-server-cert" not found Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.095948 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs podName:ff1b9fe3-84fe-47fc-902c-aa23c9e829d8 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:36.095926408 +0000 UTC m=+938.671462431 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs") pod "openstack-operator-controller-manager-7954588dd9-dngjl" (UID: "ff1b9fe3-84fe-47fc-902c-aa23c9e829d8") : secret "webhook-server-cert" not found Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.209018 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f"] Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.213908 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb"] Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.226422 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm"] Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.402770 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd" Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.403071 4817 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.403603 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert podName:e7be81da-3629-4713-87c6-34cabd9a8347 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:37.403568466 +0000 UTC m=+939.979104449 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert") pod "infra-operator-controller-manager-79d975b745-xpwgd" (UID: "e7be81da-3629-4713-87c6-34cabd9a8347") : secret "infra-operator-webhook-server-cert" not found Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.406299 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-ff84x"] Feb 18 14:14:35 crc kubenswrapper[4817]: W0218 14:14:35.414618 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f59b48_31c6_4fed_8ccc_7d722605d896.slice/crio-be39fa7d7ef07f2c6d8f6eb662a7b81b767b0870ed59037153124094a37ed5a1 WatchSource:0}: Error finding container be39fa7d7ef07f2c6d8f6eb662a7b81b767b0870ed59037153124094a37ed5a1: Status 404 returned error can't find the container with id be39fa7d7ef07f2c6d8f6eb662a7b81b767b0870ed59037153124094a37ed5a1 Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.417965 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7"] Feb 18 14:14:35 crc kubenswrapper[4817]: W0218 14:14:35.424951 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb77d6c32_6c30_42be_ab69_36b969d40950.slice/crio-e038b029de578525f8d3508a898f2ec6a0783d7faa9d5412883293ca51ebd3ec WatchSource:0}: Error finding container e038b029de578525f8d3508a898f2ec6a0783d7faa9d5412883293ca51ebd3ec: Status 404 returned error can't find the container with id e038b029de578525f8d3508a898f2ec6a0783d7faa9d5412883293ca51ebd3ec Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.426079 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-8q5c8"] Feb 18 14:14:35 crc kubenswrapper[4817]: W0218 
14:14:35.435070 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c04c342_bd87_46e7_8a2d_72dc30f858aa.slice/crio-b4c0a40816fc118f7f963fe91ed81c10d943f9f003f565b76f2b4fc8f07f5866 WatchSource:0}: Error finding container b4c0a40816fc118f7f963fe91ed81c10d943f9f003f565b76f2b4fc8f07f5866: Status 404 returned error can't find the container with id b4c0a40816fc118f7f963fe91ed81c10d943f9f003f565b76f2b4fc8f07f5866 Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.436083 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r"] Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.440675 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tm62t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-8q5c8_openstack-operators(bccec692-ee64-46e1-8979-e6173c132d8e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.441438 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpqnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-l9fqt_openstack-operators(9d933918-c23c-456a-8b3f-08ee4c2909dd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.442119 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" podUID="bccec692-ee64-46e1-8979-e6173c132d8e" Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.443304 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" podUID="9d933918-c23c-456a-8b3f-08ee4c2909dd" Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.444084 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt"] Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.449583 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n89cj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-9td4r_openstack-operators(2c04c342-bd87-46e7-8a2d-72dc30f858aa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.450763 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" podUID="2c04c342-bd87-46e7-8a2d-72dc30f858aa" Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.588643 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g"] Feb 18 14:14:35 crc kubenswrapper[4817]: W0218 14:14:35.594918 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod086958e1_8a7d_40c9_9725_f18776f863a0.slice/crio-945c2634af22ff8a8df9de2d0a0d84276f4debfebacde4735b5f47dfa572d13b WatchSource:0}: Error finding container 945c2634af22ff8a8df9de2d0a0d84276f4debfebacde4735b5f47dfa572d13b: Status 404 returned error can't find the container with id 
945c2634af22ff8a8df9de2d0a0d84276f4debfebacde4735b5f47dfa572d13b
Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.598428 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr"]
Feb 18 14:14:35 crc kubenswrapper[4817]: W0218 14:14:35.615915 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3374b90b_3a12_4e01_a0cb_ed7c51d844d7.slice/crio-0286fcde4ec5a01994d0b2c32ac65cc4d7de30ba0064fbd2fcecd39817c03060 WatchSource:0}: Error finding container 0286fcde4ec5a01994d0b2c32ac65cc4d7de30ba0064fbd2fcecd39817c03060: Status 404 returned error can't find the container with id 0286fcde4ec5a01994d0b2c32ac65cc4d7de30ba0064fbd2fcecd39817c03060
Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.621733 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.147:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mn7n9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6956d67c5c-xbjdr_openstack-operators(3374b90b-3a12-4e01-a0cb-ed7c51d844d7): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.623074 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" podUID="3374b90b-3a12-4e01-a0cb-ed7c51d844d7"
Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.708079 4817 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 14:14:35 crc kubenswrapper[4817]: E0218 14:14:35.708361 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert podName:ad995216-386a-455b-b48d-378dbfd271bf nodeName:}" failed. No retries permitted until 2026-02-18 14:14:37.70828482 +0000 UTC m=+940.283820803 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" (UID: "ad995216-386a-455b-b48d-378dbfd271bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 14:14:35 crc kubenswrapper[4817]: I0218 14:14:35.707868 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.088083 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" event={"ID":"3374b90b-3a12-4e01-a0cb-ed7c51d844d7","Type":"ContainerStarted","Data":"0286fcde4ec5a01994d0b2c32ac65cc4d7de30ba0064fbd2fcecd39817c03060"}
Feb 18 14:14:36 crc kubenswrapper[4817]: E0218 14:14:36.089802 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.147:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" podUID="3374b90b-3a12-4e01-a0cb-ed7c51d844d7"
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.090265 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb" event={"ID":"ca917110-0727-4c63-ad9a-20722a6cba34","Type":"ContainerStarted","Data":"5575f3ba19cf00ccfa27db8b5fdf274c15ff0010f29e14eac22c94fedcad88ee"}
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.092061 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" event={"ID":"9d933918-c23c-456a-8b3f-08ee4c2909dd","Type":"ContainerStarted","Data":"46fe0a16f9c5349dcd6378cc18fec306df147b0d8a46de955d2eeb72599ffd27"}
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.095164 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" event={"ID":"bccec692-ee64-46e1-8979-e6173c132d8e","Type":"ContainerStarted","Data":"199ef78ac7ceabfc8830464b4481d06b93c31652284f03f5f487a1e513e948fa"}
Feb 18 14:14:36 crc kubenswrapper[4817]: E0218 14:14:36.096762 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" podUID="9d933918-c23c-456a-8b3f-08ee4c2909dd"
Feb 18 14:14:36 crc kubenswrapper[4817]: E0218 14:14:36.097142 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" podUID="bccec692-ee64-46e1-8979-e6173c132d8e"
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.097585 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" event={"ID":"2c04c342-bd87-46e7-8a2d-72dc30f858aa","Type":"ContainerStarted","Data":"b4c0a40816fc118f7f963fe91ed81c10d943f9f003f565b76f2b4fc8f07f5866"}
Feb 18 14:14:36 crc kubenswrapper[4817]: E0218 14:14:36.098918 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" podUID="2c04c342-bd87-46e7-8a2d-72dc30f858aa"
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.099503 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b" event={"ID":"6e783396-37c1-4a0d-bfe4-495fdf4d41bf","Type":"ContainerStarted","Data":"fb97e3bf4246884fca771c3ce59079eb33f0b0efaef7cd8beb824ac620b8b3ed"}
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.102244 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7" event={"ID":"93f59b48-31c6-4fed-8ccc-7d722605d896","Type":"ContainerStarted","Data":"be39fa7d7ef07f2c6d8f6eb662a7b81b767b0870ed59037153124094a37ed5a1"}
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.103790 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx" event={"ID":"c5e8b4c9-5a63-44c3-9f6c-c7ee268dcef3","Type":"ContainerStarted","Data":"d104a3cb5e8699e0daf248921809fec77629b181229c8bca67eeddc38dd91cd9"}
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.114466 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.114609 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:36 crc kubenswrapper[4817]: E0218 14:14:36.114742 4817 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 18 14:14:36 crc kubenswrapper[4817]: E0218 14:14:36.114812 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs podName:ff1b9fe3-84fe-47fc-902c-aa23c9e829d8 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:38.114793895 +0000 UTC m=+940.690329878 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs") pod "openstack-operator-controller-manager-7954588dd9-dngjl" (UID: "ff1b9fe3-84fe-47fc-902c-aa23c9e829d8") : secret "metrics-server-cert" not found
Feb 18 14:14:36 crc kubenswrapper[4817]: E0218 14:14:36.116122 4817 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 18 14:14:36 crc kubenswrapper[4817]: E0218 14:14:36.116212 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs podName:ff1b9fe3-84fe-47fc-902c-aa23c9e829d8 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:38.11618779 +0000 UTC m=+940.691723823 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs") pod "openstack-operator-controller-manager-7954588dd9-dngjl" (UID: "ff1b9fe3-84fe-47fc-902c-aa23c9e829d8") : secret "webhook-server-cert" not found
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.127551 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm" event={"ID":"0db531ef-d3b4-4b35-9497-8892cbd3db77","Type":"ContainerStarted","Data":"7f12fc3cc45e438a386511d50ba34a11747bc41ed9e4cc27f2248e2641a294f9"}
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.131892 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f" event={"ID":"644f07fc-02ce-49f0-87bf-54f765c15d8c","Type":"ContainerStarted","Data":"706d53a50221d6440af72ee2464414ca23ec453337d39fbb373e94e3251c4d7a"}
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.143630 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ff84x" event={"ID":"b77d6c32-6c30-42be-ab69-36b969d40950","Type":"ContainerStarted","Data":"e038b029de578525f8d3508a898f2ec6a0783d7faa9d5412883293ca51ebd3ec"}
Feb 18 14:14:36 crc kubenswrapper[4817]: I0218 14:14:36.148969 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g" event={"ID":"086958e1-8a7d-40c9-9725-f18776f863a0","Type":"ContainerStarted","Data":"945c2634af22ff8a8df9de2d0a0d84276f4debfebacde4735b5f47dfa572d13b"}
Feb 18 14:14:37 crc kubenswrapper[4817]: E0218 14:14:37.188208 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" podUID="bccec692-ee64-46e1-8979-e6173c132d8e"
Feb 18 14:14:37 crc kubenswrapper[4817]: E0218 14:14:37.188306 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" podUID="9d933918-c23c-456a-8b3f-08ee4c2909dd"
Feb 18 14:14:37 crc kubenswrapper[4817]: E0218 14:14:37.188654 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.147:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" podUID="3374b90b-3a12-4e01-a0cb-ed7c51d844d7"
Feb 18 14:14:37 crc kubenswrapper[4817]: E0218 14:14:37.198774 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" podUID="2c04c342-bd87-46e7-8a2d-72dc30f858aa"
Feb 18 14:14:37 crc kubenswrapper[4817]: I0218 14:14:37.435453 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:37 crc kubenswrapper[4817]: E0218 14:14:37.435699 4817 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 18 14:14:37 crc kubenswrapper[4817]: E0218 14:14:37.435756 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert podName:e7be81da-3629-4713-87c6-34cabd9a8347 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:41.435736319 +0000 UTC m=+944.011272302 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert") pod "infra-operator-controller-manager-79d975b745-xpwgd" (UID: "e7be81da-3629-4713-87c6-34cabd9a8347") : secret "infra-operator-webhook-server-cert" not found
Feb 18 14:14:37 crc kubenswrapper[4817]: I0218 14:14:37.740704 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"
Feb 18 14:14:37 crc kubenswrapper[4817]: E0218 14:14:37.741320 4817 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 14:14:37 crc kubenswrapper[4817]: E0218 14:14:37.741377 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert podName:ad995216-386a-455b-b48d-378dbfd271bf nodeName:}" failed. No retries permitted until 2026-02-18 14:14:41.741357027 +0000 UTC m=+944.316893010 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" (UID: "ad995216-386a-455b-b48d-378dbfd271bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 14:14:38 crc kubenswrapper[4817]: I0218 14:14:38.147087 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:38 crc kubenswrapper[4817]: I0218 14:14:38.147193 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:38 crc kubenswrapper[4817]: E0218 14:14:38.147338 4817 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 18 14:14:38 crc kubenswrapper[4817]: E0218 14:14:38.147386 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs podName:ff1b9fe3-84fe-47fc-902c-aa23c9e829d8 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:42.147369299 +0000 UTC m=+944.722905272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs") pod "openstack-operator-controller-manager-7954588dd9-dngjl" (UID: "ff1b9fe3-84fe-47fc-902c-aa23c9e829d8") : secret "metrics-server-cert" not found
Feb 18 14:14:38 crc kubenswrapper[4817]: E0218 14:14:38.147724 4817 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 18 14:14:38 crc kubenswrapper[4817]: E0218 14:14:38.147776 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs podName:ff1b9fe3-84fe-47fc-902c-aa23c9e829d8 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:42.147767559 +0000 UTC m=+944.723303542 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs") pod "openstack-operator-controller-manager-7954588dd9-dngjl" (UID: "ff1b9fe3-84fe-47fc-902c-aa23c9e829d8") : secret "webhook-server-cert" not found
Feb 18 14:14:41 crc kubenswrapper[4817]: I0218 14:14:41.495383 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:41 crc kubenswrapper[4817]: E0218 14:14:41.495607 4817 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 18 14:14:41 crc kubenswrapper[4817]: E0218 14:14:41.496364 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert podName:e7be81da-3629-4713-87c6-34cabd9a8347 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:49.495782911 +0000 UTC m=+952.071318894 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert") pod "infra-operator-controller-manager-79d975b745-xpwgd" (UID: "e7be81da-3629-4713-87c6-34cabd9a8347") : secret "infra-operator-webhook-server-cert" not found
Feb 18 14:14:41 crc kubenswrapper[4817]: I0218 14:14:41.801119 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"
Feb 18 14:14:41 crc kubenswrapper[4817]: E0218 14:14:41.801299 4817 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 14:14:41 crc kubenswrapper[4817]: E0218 14:14:41.801423 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert podName:ad995216-386a-455b-b48d-378dbfd271bf nodeName:}" failed. No retries permitted until 2026-02-18 14:14:49.801399028 +0000 UTC m=+952.376935021 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" (UID: "ad995216-386a-455b-b48d-378dbfd271bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 18 14:14:42 crc kubenswrapper[4817]: I0218 14:14:42.208517 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:42 crc kubenswrapper[4817]: I0218 14:14:42.208598 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:42 crc kubenswrapper[4817]: E0218 14:14:42.208725 4817 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 18 14:14:42 crc kubenswrapper[4817]: E0218 14:14:42.208748 4817 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 18 14:14:42 crc kubenswrapper[4817]: E0218 14:14:42.208816 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs podName:ff1b9fe3-84fe-47fc-902c-aa23c9e829d8 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:50.208794825 +0000 UTC m=+952.784330838 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs") pod "openstack-operator-controller-manager-7954588dd9-dngjl" (UID: "ff1b9fe3-84fe-47fc-902c-aa23c9e829d8") : secret "metrics-server-cert" not found
Feb 18 14:14:42 crc kubenswrapper[4817]: E0218 14:14:42.208840 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs podName:ff1b9fe3-84fe-47fc-902c-aa23c9e829d8 nodeName:}" failed. No retries permitted until 2026-02-18 14:14:50.208830316 +0000 UTC m=+952.784366369 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs") pod "openstack-operator-controller-manager-7954588dd9-dngjl" (UID: "ff1b9fe3-84fe-47fc-902c-aa23c9e829d8") : secret "webhook-server-cert" not found
Feb 18 14:14:48 crc kubenswrapper[4817]: E0218 14:14:48.014115 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34"
Feb 18 14:14:48 crc kubenswrapper[4817]: E0218 14:14:48.014639 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5xd9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-qlrqm_openstack-operators(0db531ef-d3b4-4b35-9497-8892cbd3db77): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 14:14:48 crc kubenswrapper[4817]: E0218 14:14:48.015873 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm" podUID="0db531ef-d3b4-4b35-9497-8892cbd3db77"
Feb 18 14:14:48 crc kubenswrapper[4817]: E0218 14:14:48.262816 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm" podUID="0db531ef-d3b4-4b35-9497-8892cbd3db77"
Feb 18 14:14:49 crc kubenswrapper[4817]: I0218 14:14:49.543166 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:49 crc kubenswrapper[4817]: I0218 14:14:49.556786 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7be81da-3629-4713-87c6-34cabd9a8347-cert\") pod \"infra-operator-controller-manager-79d975b745-xpwgd\" (UID: \"e7be81da-3629-4713-87c6-34cabd9a8347\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:49 crc kubenswrapper[4817]: I0218 14:14:49.628733 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:14:49 crc kubenswrapper[4817]: I0218 14:14:49.847242 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"
Feb 18 14:14:49 crc kubenswrapper[4817]: I0218 14:14:49.852552 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad995216-386a-455b-b48d-378dbfd271bf-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j\" (UID: \"ad995216-386a-455b-b48d-378dbfd271bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"
Feb 18 14:14:49 crc kubenswrapper[4817]: I0218 14:14:49.953702 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"
Feb 18 14:14:50 crc kubenswrapper[4817]: I0218 14:14:50.252660 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:50 crc kubenswrapper[4817]: I0218 14:14:50.252761 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:50 crc kubenswrapper[4817]: I0218 14:14:50.255899 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-metrics-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:50 crc kubenswrapper[4817]: I0218 14:14:50.256200 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff1b9fe3-84fe-47fc-902c-aa23c9e829d8-webhook-certs\") pod \"openstack-operator-controller-manager-7954588dd9-dngjl\" (UID: \"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8\") " pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:50 crc kubenswrapper[4817]: I0218 14:14:50.266415 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:14:57 crc kubenswrapper[4817]: E0218 14:14:57.164377 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1"
Feb 18 14:14:57 crc kubenswrapper[4817]: E0218 14:14:57.165139 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hs7nx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-fs8m7_openstack-operators(85bd6fc0-d973-4172-b441-c15d4abeb604): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 14:14:57 crc kubenswrapper[4817]: E0218 14:14:57.166371 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7" podUID="85bd6fc0-d973-4172-b441-c15d4abeb604"
Feb 18 14:14:57 crc kubenswrapper[4817]: E0218 14:14:57.177711 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf"
Feb 18 14:14:57 crc kubenswrapper[4817]: E0218 14:14:57.177923 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rr4f7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-dr67b_openstack-operators(6e783396-37c1-4a0d-bfe4-495fdf4d41bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:14:57 crc kubenswrapper[4817]: E0218 14:14:57.179159 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b" podUID="6e783396-37c1-4a0d-bfe4-495fdf4d41bf" Feb 18 14:14:57 crc kubenswrapper[4817]: E0218 14:14:57.337080 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7" podUID="85bd6fc0-d973-4172-b441-c15d4abeb604" Feb 18 14:14:57 crc kubenswrapper[4817]: E0218 14:14:57.339941 4817 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b" podUID="6e783396-37c1-4a0d-bfe4-495fdf4d41bf" Feb 18 14:14:58 crc kubenswrapper[4817]: E0218 14:14:58.997840 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 18 14:14:58 crc kubenswrapper[4817]: E0218 14:14:58.998265 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lpr8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-9jkwb_openstack-operators(ca917110-0727-4c63-ad9a-20722a6cba34): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:14:59 crc kubenswrapper[4817]: E0218 14:14:59.000046 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb" podUID="ca917110-0727-4c63-ad9a-20722a6cba34" Feb 18 14:14:59 crc kubenswrapper[4817]: E0218 14:14:59.351100 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb" podUID="ca917110-0727-4c63-ad9a-20722a6cba34" Feb 18 14:15:00 crc kubenswrapper[4817]: E0218 14:14:59.999567 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 18 14:15:00 crc kubenswrapper[4817]: E0218 14:14:59.999779 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7m8d2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-5np9g_openstack-operators(086958e1-8a7d-40c9-9725-f18776f863a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:15:00 crc kubenswrapper[4817]: E0218 14:15:00.001378 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g" podUID="086958e1-8a7d-40c9-9725-f18776f863a0" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.157562 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl"] Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.159657 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.165251 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.165288 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.166539 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl"] Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.314970 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6441d64-55c6-45f3-a648-74924b94b4f0-config-volume\") pod \"collect-profiles-29523735-fw4vl\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.315663 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6441d64-55c6-45f3-a648-74924b94b4f0-secret-volume\") pod \"collect-profiles-29523735-fw4vl\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.315719 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmxwb\" (UniqueName: \"kubernetes.io/projected/b6441d64-55c6-45f3-a648-74924b94b4f0-kube-api-access-cmxwb\") pod \"collect-profiles-29523735-fw4vl\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:00 crc kubenswrapper[4817]: E0218 14:15:00.357002 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g" podUID="086958e1-8a7d-40c9-9725-f18776f863a0" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.416772 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6441d64-55c6-45f3-a648-74924b94b4f0-config-volume\") pod \"collect-profiles-29523735-fw4vl\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.416854 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6441d64-55c6-45f3-a648-74924b94b4f0-secret-volume\") pod \"collect-profiles-29523735-fw4vl\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.416901 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmxwb\" (UniqueName: \"kubernetes.io/projected/b6441d64-55c6-45f3-a648-74924b94b4f0-kube-api-access-cmxwb\") pod \"collect-profiles-29523735-fw4vl\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.419716 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b6441d64-55c6-45f3-a648-74924b94b4f0-config-volume\") pod \"collect-profiles-29523735-fw4vl\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.441510 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6441d64-55c6-45f3-a648-74924b94b4f0-secret-volume\") pod \"collect-profiles-29523735-fw4vl\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.444949 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmxwb\" (UniqueName: \"kubernetes.io/projected/b6441d64-55c6-45f3-a648-74924b94b4f0-kube-api-access-cmxwb\") pod \"collect-profiles-29523735-fw4vl\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:00 crc kubenswrapper[4817]: I0218 14:15:00.480083 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" Feb 18 14:15:02 crc kubenswrapper[4817]: I0218 14:15:02.371922 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"] Feb 18 14:15:02 crc kubenswrapper[4817]: I0218 14:15:02.382198 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"] Feb 18 14:15:02 crc kubenswrapper[4817]: I0218 14:15:02.383407 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ff84x" event={"ID":"b77d6c32-6c30-42be-ab69-36b969d40950","Type":"ContainerStarted","Data":"e4fe684b12a0028ad38f1a41b923d29fa1aed6d2cef2e76d4ab627e797b0b86a"} Feb 18 14:15:02 crc kubenswrapper[4817]: I0218 14:15:02.400905 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt" event={"ID":"01962a92-98c7-412c-86a7-ee21e6cb92a9","Type":"ContainerStarted","Data":"78cf8e03827d27e6592d5434efb56fc30c21304ddaae8345e78f384420971e18"} Feb 18 14:15:02 crc kubenswrapper[4817]: I0218 14:15:02.402112 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt" Feb 18 14:15:02 crc kubenswrapper[4817]: I0218 14:15:02.426279 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt" podStartSLOduration=5.350758385 podStartE2EDuration="29.426259398s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:34.929214481 +0000 UTC m=+937.504750464" lastFinishedPulling="2026-02-18 14:14:59.004715494 +0000 UTC m=+961.580251477" observedRunningTime="2026-02-18 14:15:02.418846212 +0000 UTC m=+964.994382215" 
watchObservedRunningTime="2026-02-18 14:15:02.426259398 +0000 UTC m=+965.001795381" Feb 18 14:15:02 crc kubenswrapper[4817]: W0218 14:15:02.503074 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7be81da_3629_4713_87c6_34cabd9a8347.slice/crio-217e6d3c4e39098a2fe9280cdd93d5f1b006d41a7f789a08743e7fca421f3535 WatchSource:0}: Error finding container 217e6d3c4e39098a2fe9280cdd93d5f1b006d41a7f789a08743e7fca421f3535: Status 404 returned error can't find the container with id 217e6d3c4e39098a2fe9280cdd93d5f1b006d41a7f789a08743e7fca421f3535 Feb 18 14:15:02 crc kubenswrapper[4817]: I0218 14:15:02.509340 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"] Feb 18 14:15:02 crc kubenswrapper[4817]: W0218 14:15:02.539855 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad995216_386a_455b_b48d_378dbfd271bf.slice/crio-fb3b75858f2b93cce005fbb47f540f0c8eb1f3924e18dd97d7089b2e900e5ec3 WatchSource:0}: Error finding container fb3b75858f2b93cce005fbb47f540f0c8eb1f3924e18dd97d7089b2e900e5ec3: Status 404 returned error can't find the container with id fb3b75858f2b93cce005fbb47f540f0c8eb1f3924e18dd97d7089b2e900e5ec3 Feb 18 14:15:02 crc kubenswrapper[4817]: I0218 14:15:02.715848 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl"] Feb 18 14:15:02 crc kubenswrapper[4817]: W0218 14:15:02.740905 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6441d64_55c6_45f3_a648_74924b94b4f0.slice/crio-e5d22ada13fe34363060e17129fd4e899497ac5c703cb198f28970a8234668f1 WatchSource:0}: Error finding container e5d22ada13fe34363060e17129fd4e899497ac5c703cb198f28970a8234668f1: Status 404 returned 
error can't find the container with id e5d22ada13fe34363060e17129fd4e899497ac5c703cb198f28970a8234668f1 Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.485337 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm" event={"ID":"0db531ef-d3b4-4b35-9497-8892cbd3db77","Type":"ContainerStarted","Data":"3008ef7dcabfe0c0f5c31385658fcd9a8fb9504bd737efa9077e1370c18d1366"} Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.486749 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.526321 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl" event={"ID":"4339e125-4e60-44d9-8e15-97b4000669e2","Type":"ContainerStarted","Data":"8c733ff4136ed66913e34b637e7f029188fc2ff2047574ae275c0336d0f78f10"} Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.527498 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.527910 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm" podStartSLOduration=3.609256125 podStartE2EDuration="30.52789871s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.243609478 +0000 UTC m=+937.819145471" lastFinishedPulling="2026-02-18 14:15:02.162252073 +0000 UTC m=+964.737788056" observedRunningTime="2026-02-18 14:15:03.517363546 +0000 UTC m=+966.092899529" watchObservedRunningTime="2026-02-18 14:15:03.52789871 +0000 UTC m=+966.103434693" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.564063 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" event={"ID":"9d933918-c23c-456a-8b3f-08ee4c2909dd","Type":"ContainerStarted","Data":"dac77cf5d8c48f1d59fc5efee53d54612e33a46d4d81a3b369c573eb4dabde7c"} Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.564931 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.584771 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5" event={"ID":"13246e06-5b63-4076-a556-de264d7afdf4","Type":"ContainerStarted","Data":"1938d2c45a2bf5473fb6c1ce8bf027ba99fb20b3b42b8f3d0f66b65776bb1e4c"} Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.585637 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.586295 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl" podStartSLOduration=5.369132147 podStartE2EDuration="30.586282703s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:34.330267615 +0000 UTC m=+936.905803598" lastFinishedPulling="2026-02-18 14:14:59.547418171 +0000 UTC m=+962.122954154" observedRunningTime="2026-02-18 14:15:03.584416496 +0000 UTC m=+966.159952469" watchObservedRunningTime="2026-02-18 14:15:03.586282703 +0000 UTC m=+966.161818696" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.614843 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt" podStartSLOduration=4.13151392 podStartE2EDuration="30.614823648s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" 
firstStartedPulling="2026-02-18 14:14:35.441101556 +0000 UTC m=+938.016637539" lastFinishedPulling="2026-02-18 14:15:01.924411284 +0000 UTC m=+964.499947267" observedRunningTime="2026-02-18 14:15:03.612401367 +0000 UTC m=+966.187937360" watchObservedRunningTime="2026-02-18 14:15:03.614823648 +0000 UTC m=+966.190359631" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.631848 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" event={"ID":"b6441d64-55c6-45f3-a648-74924b94b4f0","Type":"ContainerStarted","Data":"6b4a9274808cae48ea94dc6d50a89327c5a88a9b67bfb177ac190dbacc4a5757"} Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.632106 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" event={"ID":"b6441d64-55c6-45f3-a648-74924b94b4f0","Type":"ContainerStarted","Data":"e5d22ada13fe34363060e17129fd4e899497ac5c703cb198f28970a8234668f1"} Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.639732 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5" podStartSLOduration=6.330285968 podStartE2EDuration="30.639715992s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:34.693577817 +0000 UTC m=+937.269113800" lastFinishedPulling="2026-02-18 14:14:59.003007841 +0000 UTC m=+961.578543824" observedRunningTime="2026-02-18 14:15:03.638333397 +0000 UTC m=+966.213869380" watchObservedRunningTime="2026-02-18 14:15:03.639715992 +0000 UTC m=+966.215251975" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.658945 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" 
event={"ID":"2c04c342-bd87-46e7-8a2d-72dc30f858aa","Type":"ContainerStarted","Data":"58a45657ced9acff68fd223b97bc014260d5131796afa3b8ecc0306cce23d35f"} Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.659829 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.682321 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx" event={"ID":"c5e8b4c9-5a63-44c3-9f6c-c7ee268dcef3","Type":"ContainerStarted","Data":"accf66332a7401d997edc0c631963107336e74cfc5c23616e7b8b99f14412e98"} Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.683145 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.692244 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" podStartSLOduration=3.692227268 podStartE2EDuration="3.692227268s" podCreationTimestamp="2026-02-18 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:15:03.69031426 +0000 UTC m=+966.265850253" watchObservedRunningTime="2026-02-18 14:15:03.692227268 +0000 UTC m=+966.267763251" Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.700222 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" event={"ID":"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8","Type":"ContainerStarted","Data":"a616bf69a8fe47e373fd11efc10e245cbc69068aa2830a8e6b291a183e9de97c"} Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.700476 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" event={"ID":"ff1b9fe3-84fe-47fc-902c-aa23c9e829d8","Type":"ContainerStarted","Data":"3a4aa80477d14fdbcee820181fe96aa3603cf9768f2530a57b2842ae4f715ef0"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.701147 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.709450 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" event={"ID":"bccec692-ee64-46e1-8979-e6173c132d8e","Type":"ContainerStarted","Data":"e195d9b0336b3c2797c581cdf40aba5490fd8402a69498b2279aa2e0df720461"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.710455 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.722270 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" event={"ID":"ad995216-386a-455b-b48d-378dbfd271bf","Type":"ContainerStarted","Data":"fb3b75858f2b93cce005fbb47f540f0c8eb1f3924e18dd97d7089b2e900e5ec3"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.746260 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm" event={"ID":"4f0674c2-f05e-4276-b2b0-dc5ed66c187a","Type":"ContainerStarted","Data":"f4672c70a214e2400f608a062b298ab41eff2268180e735de6120e149b637d83"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.746611 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.751401 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259" event={"ID":"5721fd5d-07bb-44df-bfb8-4b4dd80ac7a4","Type":"ContainerStarted","Data":"da69d86eaaaf490db9689c2ed9ec2ae7e7e44781c644a96c78a4b75f0badb9aa"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.752253 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.763637 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r" podStartSLOduration=4.355334846 podStartE2EDuration="30.763621726s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.449360373 +0000 UTC m=+938.024896356" lastFinishedPulling="2026-02-18 14:15:01.857647253 +0000 UTC m=+964.433183236" observedRunningTime="2026-02-18 14:15:03.759950205 +0000 UTC m=+966.335486188" watchObservedRunningTime="2026-02-18 14:15:03.763621726 +0000 UTC m=+966.339157709"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.764874 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f" event={"ID":"644f07fc-02ce-49f0-87bf-54f765c15d8c","Type":"ContainerStarted","Data":"05bf5f4495e6162e9bd80750d0bcd23a1ac6ae445268bad20aaaa10d0953c971"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.765576 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.765740 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx" podStartSLOduration=6.826630754 podStartE2EDuration="30.765733079s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.065991548 +0000 UTC m=+937.641527531" lastFinishedPulling="2026-02-18 14:14:59.005093873 +0000 UTC m=+961.580629856" observedRunningTime="2026-02-18 14:15:03.722183748 +0000 UTC m=+966.297719731" watchObservedRunningTime="2026-02-18 14:15:03.765733079 +0000 UTC m=+966.341269062"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.771766 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd" event={"ID":"e7be81da-3629-4713-87c6-34cabd9a8347","Type":"ContainerStarted","Data":"217e6d3c4e39098a2fe9280cdd93d5f1b006d41a7f789a08743e7fca421f3535"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.774764 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x" event={"ID":"00831f79-f6d5-4896-b718-4120117751b8","Type":"ContainerStarted","Data":"013a5ae3e91729c77997d438fcf30404e1b2e3be8218f592fc23bfb651d09487"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.775125 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.782020 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8" podStartSLOduration=4.394397686 podStartE2EDuration="30.782004017s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.438493741 +0000 UTC m=+938.014029724" lastFinishedPulling="2026-02-18 14:15:01.826100072 +0000 UTC m=+964.401636055" observedRunningTime="2026-02-18 14:15:03.781534055 +0000 UTC m=+966.357070038" watchObservedRunningTime="2026-02-18 14:15:03.782004017 +0000 UTC m=+966.357540000"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.786093 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj" event={"ID":"4441a78b-c58a-4030-801b-06dbfa1729b1","Type":"ContainerStarted","Data":"b45dacf6136cbadd5e21022daf8feedcbb70f3a3b074da4973416b407235e21e"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.786742 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.792570 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7" event={"ID":"93f59b48-31c6-4fed-8ccc-7d722605d896","Type":"ContainerStarted","Data":"6401c40c703e4c606df133be6c1353180431c39a6dbf89f6a183da73f738c7ed"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.793248 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.799208 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" event={"ID":"3374b90b-3a12-4e01-a0cb-ed7c51d844d7","Type":"ContainerStarted","Data":"db0d63e737fa4dac8a832399d499af214575539a009ae2ca425fabb1bb99f115"}
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.799530 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.799836 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ff84x"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.809619 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259" podStartSLOduration=6.072607702 podStartE2EDuration="30.809599038s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:34.810307912 +0000 UTC m=+937.385843895" lastFinishedPulling="2026-02-18 14:14:59.547299248 +0000 UTC m=+962.122835231" observedRunningTime="2026-02-18 14:15:03.800844049 +0000 UTC m=+966.376380022" watchObservedRunningTime="2026-02-18 14:15:03.809599038 +0000 UTC m=+966.385135021"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.894291 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl" podStartSLOduration=30.89427658 podStartE2EDuration="30.89427658s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:15:03.866420592 +0000 UTC m=+966.441956575" watchObservedRunningTime="2026-02-18 14:15:03.89427658 +0000 UTC m=+966.469812563"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.897986 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm" podStartSLOduration=6.190098156 podStartE2EDuration="30.897968652s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:34.272153829 +0000 UTC m=+936.847689812" lastFinishedPulling="2026-02-18 14:14:58.980024325 +0000 UTC m=+961.555560308" observedRunningTime="2026-02-18 14:15:03.892236289 +0000 UTC m=+966.467772272" watchObservedRunningTime="2026-02-18 14:15:03.897968652 +0000 UTC m=+966.473504635"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.976761 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7" podStartSLOduration=7.396631504 podStartE2EDuration="30.976744556s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.424990332 +0000 UTC m=+938.000526315" lastFinishedPulling="2026-02-18 14:14:59.005103384 +0000 UTC m=+961.580639367" observedRunningTime="2026-02-18 14:15:03.944340294 +0000 UTC m=+966.519876317" watchObservedRunningTime="2026-02-18 14:15:03.976744556 +0000 UTC m=+966.552280539"
Feb 18 14:15:03 crc kubenswrapper[4817]: I0218 14:15:03.990475 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ff84x" podStartSLOduration=7.445159271 podStartE2EDuration="30.99045601s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.433954677 +0000 UTC m=+938.009490660" lastFinishedPulling="2026-02-18 14:14:58.979251406 +0000 UTC m=+961.554787399" observedRunningTime="2026-02-18 14:15:03.976358006 +0000 UTC m=+966.551893989" watchObservedRunningTime="2026-02-18 14:15:03.99045601 +0000 UTC m=+966.565991993"
Feb 18 14:15:04 crc kubenswrapper[4817]: I0218 14:15:04.057766 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj" podStartSLOduration=6.569070741 podStartE2EDuration="31.057750706s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:34.516158692 +0000 UTC m=+937.091694675" lastFinishedPulling="2026-02-18 14:14:59.004838657 +0000 UTC m=+961.580374640" observedRunningTime="2026-02-18 14:15:04.055683324 +0000 UTC m=+966.631219307" watchObservedRunningTime="2026-02-18 14:15:04.057750706 +0000 UTC m=+966.633286689"
Feb 18 14:15:04 crc kubenswrapper[4817]: I0218 14:15:04.058698 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr" podStartSLOduration=4.468484512 podStartE2EDuration="31.058692139s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.621474855 +0000 UTC m=+938.197010838" lastFinishedPulling="2026-02-18 14:15:02.211682482 +0000 UTC m=+964.787218465" observedRunningTime="2026-02-18 14:15:04.024301868 +0000 UTC m=+966.599837851" watchObservedRunningTime="2026-02-18 14:15:04.058692139 +0000 UTC m=+966.634228122"
Feb 18 14:15:04 crc kubenswrapper[4817]: I0218 14:15:04.084595 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x" podStartSLOduration=6.648400048 podStartE2EDuration="31.084574768s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:34.543111537 +0000 UTC m=+937.118647520" lastFinishedPulling="2026-02-18 14:14:58.979286257 +0000 UTC m=+961.554822240" observedRunningTime="2026-02-18 14:15:04.084231689 +0000 UTC m=+966.659767662" watchObservedRunningTime="2026-02-18 14:15:04.084574768 +0000 UTC m=+966.660110761"
Feb 18 14:15:04 crc kubenswrapper[4817]: I0218 14:15:04.141524 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f" podStartSLOduration=7.378604573 podStartE2EDuration="31.141505104s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.241918415 +0000 UTC m=+937.817454398" lastFinishedPulling="2026-02-18 14:14:59.004818936 +0000 UTC m=+961.580354929" observedRunningTime="2026-02-18 14:15:04.124238552 +0000 UTC m=+966.699774535" watchObservedRunningTime="2026-02-18 14:15:04.141505104 +0000 UTC m=+966.717041087"
Feb 18 14:15:04 crc kubenswrapper[4817]: I0218 14:15:04.809883 4817 generic.go:334] "Generic (PLEG): container finished" podID="b6441d64-55c6-45f3-a648-74924b94b4f0" containerID="6b4a9274808cae48ea94dc6d50a89327c5a88a9b67bfb177ac190dbacc4a5757" exitCode=0
Feb 18 14:15:04 crc kubenswrapper[4817]: I0218 14:15:04.809959 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" event={"ID":"b6441d64-55c6-45f3-a648-74924b94b4f0","Type":"ContainerDied","Data":"6b4a9274808cae48ea94dc6d50a89327c5a88a9b67bfb177ac190dbacc4a5757"}
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.195451 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl"
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.344623 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmxwb\" (UniqueName: \"kubernetes.io/projected/b6441d64-55c6-45f3-a648-74924b94b4f0-kube-api-access-cmxwb\") pod \"b6441d64-55c6-45f3-a648-74924b94b4f0\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") "
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.344691 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6441d64-55c6-45f3-a648-74924b94b4f0-secret-volume\") pod \"b6441d64-55c6-45f3-a648-74924b94b4f0\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") "
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.344814 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6441d64-55c6-45f3-a648-74924b94b4f0-config-volume\") pod \"b6441d64-55c6-45f3-a648-74924b94b4f0\" (UID: \"b6441d64-55c6-45f3-a648-74924b94b4f0\") "
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.345641 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6441d64-55c6-45f3-a648-74924b94b4f0-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6441d64-55c6-45f3-a648-74924b94b4f0" (UID: "b6441d64-55c6-45f3-a648-74924b94b4f0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.362212 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6441d64-55c6-45f3-a648-74924b94b4f0-kube-api-access-cmxwb" (OuterVolumeSpecName: "kube-api-access-cmxwb") pod "b6441d64-55c6-45f3-a648-74924b94b4f0" (UID: "b6441d64-55c6-45f3-a648-74924b94b4f0"). InnerVolumeSpecName "kube-api-access-cmxwb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.362529 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6441d64-55c6-45f3-a648-74924b94b4f0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6441d64-55c6-45f3-a648-74924b94b4f0" (UID: "b6441d64-55c6-45f3-a648-74924b94b4f0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.446475 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6441d64-55c6-45f3-a648-74924b94b4f0-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.446505 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmxwb\" (UniqueName: \"kubernetes.io/projected/b6441d64-55c6-45f3-a648-74924b94b4f0-kube-api-access-cmxwb\") on node \"crc\" DevicePath \"\""
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.446516 4817 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6441d64-55c6-45f3-a648-74924b94b4f0-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.829606 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl" event={"ID":"b6441d64-55c6-45f3-a648-74924b94b4f0","Type":"ContainerDied","Data":"e5d22ada13fe34363060e17129fd4e899497ac5c703cb198f28970a8234668f1"}
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.829629 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl"
Feb 18 14:15:06 crc kubenswrapper[4817]: I0218 14:15:06.829651 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5d22ada13fe34363060e17129fd4e899497ac5c703cb198f28970a8234668f1"
Feb 18 14:15:07 crc kubenswrapper[4817]: I0218 14:15:07.837765 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd" event={"ID":"e7be81da-3629-4713-87c6-34cabd9a8347","Type":"ContainerStarted","Data":"07f5b68e06d6de571937e0d1073a092e6bd65026e11d4bd50def148a2b27d66c"}
Feb 18 14:15:07 crc kubenswrapper[4817]: I0218 14:15:07.838544 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:15:07 crc kubenswrapper[4817]: I0218 14:15:07.839937 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" event={"ID":"ad995216-386a-455b-b48d-378dbfd271bf","Type":"ContainerStarted","Data":"9c795002438ac17a2764a945cc9f0869668300d13d68d38f5cb9b2f282bc5250"}
Feb 18 14:15:07 crc kubenswrapper[4817]: I0218 14:15:07.840360 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"
Feb 18 14:15:07 crc kubenswrapper[4817]: I0218 14:15:07.862615 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd" podStartSLOduration=30.632957291 podStartE2EDuration="34.862596333s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:15:02.523286319 +0000 UTC m=+965.098822292" lastFinishedPulling="2026-02-18 14:15:06.752925351 +0000 UTC m=+969.328461334" observedRunningTime="2026-02-18 14:15:07.855249149 +0000 UTC m=+970.430785132" watchObservedRunningTime="2026-02-18 14:15:07.862596333 +0000 UTC m=+970.438132316"
Feb 18 14:15:07 crc kubenswrapper[4817]: I0218 14:15:07.883555 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j" podStartSLOduration=30.691808946 podStartE2EDuration="34.883535758s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:15:02.543321831 +0000 UTC m=+965.118857814" lastFinishedPulling="2026-02-18 14:15:06.735048643 +0000 UTC m=+969.310584626" observedRunningTime="2026-02-18 14:15:07.878263816 +0000 UTC m=+970.453799799" watchObservedRunningTime="2026-02-18 14:15:07.883535758 +0000 UTC m=+970.459071741"
Feb 18 14:15:10 crc kubenswrapper[4817]: I0218 14:15:10.273203 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7954588dd9-dngjl"
Feb 18 14:15:11 crc kubenswrapper[4817]: I0218 14:15:11.873407 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b" event={"ID":"6e783396-37c1-4a0d-bfe4-495fdf4d41bf","Type":"ContainerStarted","Data":"b96a57fc67ee856d707d962fbaee345ce81253d7a67285444cc4d945f4187f89"}
Feb 18 14:15:11 crc kubenswrapper[4817]: I0218 14:15:11.874187 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b"
Feb 18 14:15:11 crc kubenswrapper[4817]: I0218 14:15:11.889710 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b" podStartSLOduration=2.377741678 podStartE2EDuration="38.889691879s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.061326641 +0000 UTC m=+937.636862624" lastFinishedPulling="2026-02-18 14:15:11.573276852 +0000 UTC m=+974.148812825" observedRunningTime="2026-02-18 14:15:11.887494734 +0000 UTC m=+974.463030717" watchObservedRunningTime="2026-02-18 14:15:11.889691879 +0000 UTC m=+974.465227862"
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.572850 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rs8vm"
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.596773 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-z64rl"
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.631048 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-hwmjj"
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.738514 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-c2b7x"
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.895829 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7" event={"ID":"85bd6fc0-d973-4172-b441-c15d4abeb604","Type":"ContainerStarted","Data":"2aa2e591e9c5ba8273f9fa4fdb5dbde954f6d8d1b19f37d8298c12065ade172f"}
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.896089 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7"
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.899798 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb" event={"ID":"ca917110-0727-4c63-ad9a-20722a6cba34","Type":"ContainerStarted","Data":"9059088f20daa533e0efa5051eba8f5698f7e07503c113ba2480091f2d125d02"}
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.900068 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb"
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.902268 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g" event={"ID":"086958e1-8a7d-40c9-9725-f18776f863a0","Type":"ContainerStarted","Data":"1c2ec6c4907045574e0d3df8745972fad0c7d0a0ac2941fea17ff2ecfc0f07f7"}
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.911516 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7" podStartSLOduration=2.977943176 podStartE2EDuration="40.911498524s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:34.901565268 +0000 UTC m=+937.477101251" lastFinishedPulling="2026-02-18 14:15:12.835120626 +0000 UTC m=+975.410656599" observedRunningTime="2026-02-18 14:15:13.910567791 +0000 UTC m=+976.486103774" watchObservedRunningTime="2026-02-18 14:15:13.911498524 +0000 UTC m=+976.487034507"
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.932348 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-5np9g" podStartSLOduration=2.696457118 podStartE2EDuration="39.932324296s" podCreationTimestamp="2026-02-18 14:14:34 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.599972366 +0000 UTC m=+938.175508349" lastFinishedPulling="2026-02-18 14:15:12.835839544 +0000 UTC m=+975.411375527" observedRunningTime="2026-02-18 14:15:13.927843383 +0000 UTC m=+976.503379376" watchObservedRunningTime="2026-02-18 14:15:13.932324296 +0000 UTC m=+976.507860279"
Feb 18 14:15:13 crc kubenswrapper[4817]: I0218 14:15:13.947856 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb" podStartSLOduration=3.349944556 podStartE2EDuration="40.947835214s" podCreationTimestamp="2026-02-18 14:14:33 +0000 UTC" firstStartedPulling="2026-02-18 14:14:35.236852428 +0000 UTC m=+937.812388411" lastFinishedPulling="2026-02-18 14:15:12.834743086 +0000 UTC m=+975.410279069" observedRunningTime="2026-02-18 14:15:13.943319661 +0000 UTC m=+976.518855644" watchObservedRunningTime="2026-02-18 14:15:13.947835214 +0000 UTC m=+976.523371197"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.003556 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zknf5"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.043513 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-gq259"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.097867 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-64rvt"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.159429 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-4h8qx"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.187571 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4rv4f"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.291969 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-qlrqm"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.337297 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-l9fqt"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.400739 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9td4r"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.424835 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ff84x"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.446039 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6956d67c5c-xbjdr"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.523596 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-8q5c8"
Feb 18 14:15:14 crc kubenswrapper[4817]: I0218 14:15:14.600636 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-8pfz7"
Feb 18 14:15:19 crc kubenswrapper[4817]: I0218 14:15:19.636037 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-xpwgd"
Feb 18 14:15:19 crc kubenswrapper[4817]: I0218 14:15:19.958944 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j"
Feb 18 14:15:24 crc kubenswrapper[4817]: I0218 14:15:24.110508 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-fs8m7"
Feb 18 14:15:24 crc kubenswrapper[4817]: I0218 14:15:24.222252 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-dr67b"
Feb 18 14:15:24 crc kubenswrapper[4817]: I0218 14:15:24.252962 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9jkwb"
Feb 18 14:15:42 crc kubenswrapper[4817]: I0218 14:15:42.863037 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:15:42 crc kubenswrapper[4817]: I0218 14:15:42.863653 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.260098 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r54zh"]
Feb 18 14:15:44 crc kubenswrapper[4817]: E0218 14:15:44.261168 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6441d64-55c6-45f3-a648-74924b94b4f0" containerName="collect-profiles"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.261187 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6441d64-55c6-45f3-a648-74924b94b4f0" containerName="collect-profiles"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.261378 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6441d64-55c6-45f3-a648-74924b94b4f0" containerName="collect-profiles"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.262360 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.269248 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.269291 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r54zh"]
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.269522 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wxbvf"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.269685 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.269810 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.357105 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kckq\" (UniqueName: \"kubernetes.io/projected/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-kube-api-access-7kckq\") pod \"dnsmasq-dns-675f4bcbfc-r54zh\" (UID: \"0f31baa1-620e-461a-99cb-cf33a2c9ebe9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.357177 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-config\") pod \"dnsmasq-dns-675f4bcbfc-r54zh\" (UID: \"0f31baa1-620e-461a-99cb-cf33a2c9ebe9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.365705 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d4d9k"]
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.367990 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.387380 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.396898 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d4d9k"]
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.461751 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-d4d9k\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.461839 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-config\") pod \"dnsmasq-dns-78dd6ddcc-d4d9k\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.461920 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gdzv\" (UniqueName: \"kubernetes.io/projected/a0f55bee-e270-4511-b150-5fe86f80e614-kube-api-access-2gdzv\") pod \"dnsmasq-dns-78dd6ddcc-d4d9k\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.461961 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kckq\" (UniqueName: \"kubernetes.io/projected/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-kube-api-access-7kckq\") pod \"dnsmasq-dns-675f4bcbfc-r54zh\" (UID: \"0f31baa1-620e-461a-99cb-cf33a2c9ebe9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.462009 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-config\") pod \"dnsmasq-dns-675f4bcbfc-r54zh\" (UID: \"0f31baa1-620e-461a-99cb-cf33a2c9ebe9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.463142 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-config\") pod \"dnsmasq-dns-675f4bcbfc-r54zh\" (UID: \"0f31baa1-620e-461a-99cb-cf33a2c9ebe9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.504125 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kckq\" (UniqueName: \"kubernetes.io/projected/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-kube-api-access-7kckq\") pod \"dnsmasq-dns-675f4bcbfc-r54zh\" (UID: \"0f31baa1-620e-461a-99cb-cf33a2c9ebe9\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.563785 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-d4d9k\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.563859 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-config\") pod \"dnsmasq-dns-78dd6ddcc-d4d9k\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.563935 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gdzv\" (UniqueName: \"kubernetes.io/projected/a0f55bee-e270-4511-b150-5fe86f80e614-kube-api-access-2gdzv\") pod \"dnsmasq-dns-78dd6ddcc-d4d9k\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.565044 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-d4d9k\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.565253 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-config\") pod \"dnsmasq-dns-78dd6ddcc-d4d9k\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.585809 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gdzv\" (UniqueName: \"kubernetes.io/projected/a0f55bee-e270-4511-b150-5fe86f80e614-kube-api-access-2gdzv\") pod \"dnsmasq-dns-78dd6ddcc-d4d9k\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.601911 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh"
Feb 18 14:15:44 crc kubenswrapper[4817]: I0218 14:15:44.709415 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k"
Feb 18 14:15:45 crc kubenswrapper[4817]: I0218 14:15:45.061920 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r54zh"]
Feb 18 14:15:45 crc kubenswrapper[4817]: I0218 14:15:45.124825 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh" event={"ID":"0f31baa1-620e-461a-99cb-cf33a2c9ebe9","Type":"ContainerStarted","Data":"d4af17dbcfc28c5d15790b9b0df7bee318dcfd9ff7ce126da6408c8576815e44"}
Feb 18 14:15:45 crc kubenswrapper[4817]: I0218 14:15:45.145383 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d4d9k"]
Feb 18 14:15:45 crc kubenswrapper[4817]: W0218 14:15:45.148184 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0f55bee_e270_4511_b150_5fe86f80e614.slice/crio-e7376d29005cf9cbf6b4d2fddee5837f70d9c5122f50ac3c0bbbeaad505304b7 WatchSource:0}: Error finding container e7376d29005cf9cbf6b4d2fddee5837f70d9c5122f50ac3c0bbbeaad505304b7: Status 404 returned error can't find the container with id e7376d29005cf9cbf6b4d2fddee5837f70d9c5122f50ac3c0bbbeaad505304b7
Feb 18 14:15:46 crc kubenswrapper[4817]: I0218 14:15:46.135582 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k" event={"ID":"a0f55bee-e270-4511-b150-5fe86f80e614","Type":"ContainerStarted","Data":"e7376d29005cf9cbf6b4d2fddee5837f70d9c5122f50ac3c0bbbeaad505304b7"}
Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.071238 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r54zh"]
Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.095495 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lczjn"]
Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.102296 4817 util.go:30] "No sandbox for pod can be
found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.108541 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lczjn"] Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.206682 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42gj8\" (UniqueName: \"kubernetes.io/projected/c43d705a-6aa5-43a0-839d-1ab705e28be6-kube-api-access-42gj8\") pod \"dnsmasq-dns-666b6646f7-lczjn\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.206927 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lczjn\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.207021 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-config\") pod \"dnsmasq-dns-666b6646f7-lczjn\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.308828 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lczjn\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.308894 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-config\") pod \"dnsmasq-dns-666b6646f7-lczjn\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.310214 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lczjn\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.310288 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-config\") pod \"dnsmasq-dns-666b6646f7-lczjn\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.310454 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42gj8\" (UniqueName: \"kubernetes.io/projected/c43d705a-6aa5-43a0-839d-1ab705e28be6-kube-api-access-42gj8\") pod \"dnsmasq-dns-666b6646f7-lczjn\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.343320 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42gj8\" (UniqueName: \"kubernetes.io/projected/c43d705a-6aa5-43a0-839d-1ab705e28be6-kube-api-access-42gj8\") pod \"dnsmasq-dns-666b6646f7-lczjn\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.419640 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d4d9k"] Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.429907 4817 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.450435 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lfrgc"] Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.451665 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.466763 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lfrgc"] Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.517694 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4q62\" (UniqueName: \"kubernetes.io/projected/5ffa1255-1d7c-43a4-a197-931d34164a31-kube-api-access-g4q62\") pod \"dnsmasq-dns-57d769cc4f-lfrgc\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.517743 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-config\") pod \"dnsmasq-dns-57d769cc4f-lfrgc\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.517780 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lfrgc\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.619524 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4q62\" (UniqueName: 
\"kubernetes.io/projected/5ffa1255-1d7c-43a4-a197-931d34164a31-kube-api-access-g4q62\") pod \"dnsmasq-dns-57d769cc4f-lfrgc\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.619590 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-config\") pod \"dnsmasq-dns-57d769cc4f-lfrgc\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.619639 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lfrgc\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.621534 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lfrgc\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.629544 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-config\") pod \"dnsmasq-dns-57d769cc4f-lfrgc\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.653813 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4q62\" (UniqueName: \"kubernetes.io/projected/5ffa1255-1d7c-43a4-a197-931d34164a31-kube-api-access-g4q62\") pod \"dnsmasq-dns-57d769cc4f-lfrgc\" 
(UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:47 crc kubenswrapper[4817]: I0218 14:15:47.806314 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.034893 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lczjn"] Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.160204 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" event={"ID":"c43d705a-6aa5-43a0-839d-1ab705e28be6","Type":"ContainerStarted","Data":"f2aadc52b196d6aa00c6e8eb3526ecad5b6465654003e4e07159bf48070be824"} Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.244745 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.246630 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.251471 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.251693 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.251969 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.252113 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.252237 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5rlj7" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.252309 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.252325 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.282255 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.307262 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lfrgc"] Feb 18 14:15:48 crc kubenswrapper[4817]: W0218 14:15:48.314894 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ffa1255_1d7c_43a4_a197_931d34164a31.slice/crio-b230e337a15cebeaabb082fc171738df6fe4c6b66870a032585c55f85039c269 WatchSource:0}: Error finding container b230e337a15cebeaabb082fc171738df6fe4c6b66870a032585c55f85039c269: Status 404 returned error 
can't find the container with id b230e337a15cebeaabb082fc171738df6fe4c6b66870a032585c55f85039c269 Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.338386 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.338483 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.338554 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.338579 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.338724 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " 
pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.338793 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.338853 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-config-data\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.338929 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.339068 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgh5g\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-kube-api-access-wgh5g\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.339143 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " 
pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.339226 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.442030 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.442094 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.442117 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.442144 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.442177 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.442967 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.444482 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.444596 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-config-data\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.445074 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.445143 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.445176 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.445378 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgh5g\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-kube-api-access-wgh5g\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.445426 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.445464 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.445573 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 
18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.447608 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.448706 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.450168 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.450206 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ec3ed5eb3e080d5af3d3f8c52d88ffd1c20f1133cdda296c9c2f6b5efdb03bc5/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.453358 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.454966 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.475009 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.478150 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgh5g\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-kube-api-access-wgh5g\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.486077 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") pod \"rabbitmq-server-0\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") " pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.568041 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.569239 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.580070 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.584287 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.584579 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.584339 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.584760 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-v8dct" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.584811 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.585054 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.585161 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.599230 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659206 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659242 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659290 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659309 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659328 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659366 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659383 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659415 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1b819bed-6fd2-438e-9959-c6456132bba7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659432 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99sfr\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-kube-api-access-99sfr\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659450 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/14e634c8-da00-43a5-96a8-33e8bf806873-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.659479 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/14e634c8-da00-43a5-96a8-33e8bf806873-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761382 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761458 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761503 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1b819bed-6fd2-438e-9959-c6456132bba7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761579 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99sfr\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-kube-api-access-99sfr\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761607 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/14e634c8-da00-43a5-96a8-33e8bf806873-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761662 4817 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/14e634c8-da00-43a5-96a8-33e8bf806873-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761715 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761733 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761789 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761814 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.761833 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.763174 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.764002 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.765478 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.765508 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.766081 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.768191 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.768512 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/14e634c8-da00-43a5-96a8-33e8bf806873-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.771644 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/14e634c8-da00-43a5-96a8-33e8bf806873-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.773687 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.776539 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.776601 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1b819bed-6fd2-438e-9959-c6456132bba7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/03221d3a9d26cd285b43664e9f0aaceceb14476da6a82478bb8f80895eef44b7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.799923 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99sfr\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-kube-api-access-99sfr\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.810538 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1b819bed-6fd2-438e-9959-c6456132bba7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") pod \"rabbitmq-cell1-server-0\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:48 crc kubenswrapper[4817]: I0218 14:15:48.918288 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.181329 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" event={"ID":"5ffa1255-1d7c-43a4-a197-931d34164a31","Type":"ContainerStarted","Data":"b230e337a15cebeaabb082fc171738df6fe4c6b66870a032585c55f85039c269"} Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.788727 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.789948 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.794598 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-hqn2r" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.796908 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.798493 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.802494 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.802751 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.814277 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.880152 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/def7b080-de6e-49f1-9437-44d6f40b48c4-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.880243 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/def7b080-de6e-49f1-9437-44d6f40b48c4-config-data-default\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.880281 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d84b0391-3ccd-400a-90bf-65bd569fc51b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d84b0391-3ccd-400a-90bf-65bd569fc51b\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.880551 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def7b080-de6e-49f1-9437-44d6f40b48c4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.880650 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/def7b080-de6e-49f1-9437-44d6f40b48c4-kolla-config\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.880719 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/def7b080-de6e-49f1-9437-44d6f40b48c4-operator-scripts\") pod 
\"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.880750 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lthwz\" (UniqueName: \"kubernetes.io/projected/def7b080-de6e-49f1-9437-44d6f40b48c4-kube-api-access-lthwz\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.880770 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/def7b080-de6e-49f1-9437-44d6f40b48c4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.982508 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/def7b080-de6e-49f1-9437-44d6f40b48c4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.982582 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/def7b080-de6e-49f1-9437-44d6f40b48c4-config-data-default\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.982603 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d84b0391-3ccd-400a-90bf-65bd569fc51b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d84b0391-3ccd-400a-90bf-65bd569fc51b\") pod \"openstack-galera-0\" (UID: 
\"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.982631 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def7b080-de6e-49f1-9437-44d6f40b48c4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.982675 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/def7b080-de6e-49f1-9437-44d6f40b48c4-kolla-config\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.982723 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/def7b080-de6e-49f1-9437-44d6f40b48c4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.982751 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lthwz\" (UniqueName: \"kubernetes.io/projected/def7b080-de6e-49f1-9437-44d6f40b48c4-kube-api-access-lthwz\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.982774 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/def7b080-de6e-49f1-9437-44d6f40b48c4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 
14:15:49.983825 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/def7b080-de6e-49f1-9437-44d6f40b48c4-kolla-config\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.984133 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/def7b080-de6e-49f1-9437-44d6f40b48c4-config-data-default\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.984494 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/def7b080-de6e-49f1-9437-44d6f40b48c4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.985686 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/def7b080-de6e-49f1-9437-44d6f40b48c4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.987377 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def7b080-de6e-49f1-9437-44d6f40b48c4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.988927 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.988994 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d84b0391-3ccd-400a-90bf-65bd569fc51b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d84b0391-3ccd-400a-90bf-65bd569fc51b\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/402b729bbbb093077608d3fb14ecbe22a2861c7e593dee4283aed811a3d399c2/globalmount\"" pod="openstack/openstack-galera-0" Feb 18 14:15:49 crc kubenswrapper[4817]: I0218 14:15:49.989275 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/def7b080-de6e-49f1-9437-44d6f40b48c4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:50 crc kubenswrapper[4817]: I0218 14:15:50.001461 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lthwz\" (UniqueName: \"kubernetes.io/projected/def7b080-de6e-49f1-9437-44d6f40b48c4-kube-api-access-lthwz\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:50 crc kubenswrapper[4817]: I0218 14:15:50.032454 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d84b0391-3ccd-400a-90bf-65bd569fc51b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d84b0391-3ccd-400a-90bf-65bd569fc51b\") pod \"openstack-galera-0\" (UID: \"def7b080-de6e-49f1-9437-44d6f40b48c4\") " pod="openstack/openstack-galera-0" Feb 18 14:15:50 crc kubenswrapper[4817]: I0218 14:15:50.137589 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.129859 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.132639 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.137730 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.137735 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2pr7d" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.138211 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.138415 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.141261 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.300828 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-535c82af-06f9-4ef6-b7d5-309a7777e4cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-535c82af-06f9-4ef6-b7d5-309a7777e4cf\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.300878 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641785a9-2372-4857-8882-192bf7d7fe45-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.300914 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prh8r\" (UniqueName: \"kubernetes.io/projected/641785a9-2372-4857-8882-192bf7d7fe45-kube-api-access-prh8r\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.300963 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/641785a9-2372-4857-8882-192bf7d7fe45-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.301034 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/641785a9-2372-4857-8882-192bf7d7fe45-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.301101 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/641785a9-2372-4857-8882-192bf7d7fe45-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.301168 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/641785a9-2372-4857-8882-192bf7d7fe45-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.301195 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/641785a9-2372-4857-8882-192bf7d7fe45-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.399168 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.401736 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.402546 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/641785a9-2372-4857-8882-192bf7d7fe45-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.402615 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/641785a9-2372-4857-8882-192bf7d7fe45-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.402633 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/641785a9-2372-4857-8882-192bf7d7fe45-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.402694 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-535c82af-06f9-4ef6-b7d5-309a7777e4cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-535c82af-06f9-4ef6-b7d5-309a7777e4cf\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.402722 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641785a9-2372-4857-8882-192bf7d7fe45-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.402750 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prh8r\" (UniqueName: \"kubernetes.io/projected/641785a9-2372-4857-8882-192bf7d7fe45-kube-api-access-prh8r\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.402787 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/641785a9-2372-4857-8882-192bf7d7fe45-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.402829 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/641785a9-2372-4857-8882-192bf7d7fe45-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.404365 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/641785a9-2372-4857-8882-192bf7d7fe45-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.404887 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/641785a9-2372-4857-8882-192bf7d7fe45-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.405317 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641785a9-2372-4857-8882-192bf7d7fe45-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.405621 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/641785a9-2372-4857-8882-192bf7d7fe45-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.406342 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.406568 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.406849 
4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-hsn7w" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.408326 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/641785a9-2372-4857-8882-192bf7d7fe45-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.409296 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/641785a9-2372-4857-8882-192bf7d7fe45-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.410177 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.410206 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-535c82af-06f9-4ef6-b7d5-309a7777e4cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-535c82af-06f9-4ef6-b7d5-309a7777e4cf\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/32ecb024218ed71711a4da17045eee0b0cd71ac78b1c160a4be5f01fc5eb55d1/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.413514 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.426401 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prh8r\" (UniqueName: \"kubernetes.io/projected/641785a9-2372-4857-8882-192bf7d7fe45-kube-api-access-prh8r\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.456831 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-535c82af-06f9-4ef6-b7d5-309a7777e4cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-535c82af-06f9-4ef6-b7d5-309a7777e4cf\") pod \"openstack-cell1-galera-0\" (UID: \"641785a9-2372-4857-8882-192bf7d7fe45\") " pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.462230 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.504376 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb347a6f-041d-41e7-be8b-b151f150e6ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.504837 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8rrb\" (UniqueName: \"kubernetes.io/projected/cb347a6f-041d-41e7-be8b-b151f150e6ab-kube-api-access-d8rrb\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.504880 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb347a6f-041d-41e7-be8b-b151f150e6ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.504935 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb347a6f-041d-41e7-be8b-b151f150e6ab-config-data\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.504958 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb347a6f-041d-41e7-be8b-b151f150e6ab-kolla-config\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 
14:15:51.607830 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb347a6f-041d-41e7-be8b-b151f150e6ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.608850 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb347a6f-041d-41e7-be8b-b151f150e6ab-config-data\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.608887 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb347a6f-041d-41e7-be8b-b151f150e6ab-kolla-config\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.608936 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb347a6f-041d-41e7-be8b-b151f150e6ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.609031 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8rrb\" (UniqueName: \"kubernetes.io/projected/cb347a6f-041d-41e7-be8b-b151f150e6ab-kube-api-access-d8rrb\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.610967 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb347a6f-041d-41e7-be8b-b151f150e6ab-kolla-config\") pod 
\"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.611065 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb347a6f-041d-41e7-be8b-b151f150e6ab-config-data\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.622950 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb347a6f-041d-41e7-be8b-b151f150e6ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.622953 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb347a6f-041d-41e7-be8b-b151f150e6ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.634887 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8rrb\" (UniqueName: \"kubernetes.io/projected/cb347a6f-041d-41e7-be8b-b151f150e6ab-kube-api-access-d8rrb\") pod \"memcached-0\" (UID: \"cb347a6f-041d-41e7-be8b-b151f150e6ab\") " pod="openstack/memcached-0" Feb 18 14:15:51 crc kubenswrapper[4817]: I0218 14:15:51.803154 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.069250 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.070803 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.074204 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-h9txp" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.080537 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.263919 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4982j\" (UniqueName: \"kubernetes.io/projected/b50fdcb5-1983-4d14-ab5f-390b3dc090ce-kube-api-access-4982j\") pod \"kube-state-metrics-0\" (UID: \"b50fdcb5-1983-4d14-ab5f-390b3dc090ce\") " pod="openstack/kube-state-metrics-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.369774 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4982j\" (UniqueName: \"kubernetes.io/projected/b50fdcb5-1983-4d14-ab5f-390b3dc090ce-kube-api-access-4982j\") pod \"kube-state-metrics-0\" (UID: \"b50fdcb5-1983-4d14-ab5f-390b3dc090ce\") " pod="openstack/kube-state-metrics-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.403063 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4982j\" (UniqueName: \"kubernetes.io/projected/b50fdcb5-1983-4d14-ab5f-390b3dc090ce-kube-api-access-4982j\") pod \"kube-state-metrics-0\" (UID: \"b50fdcb5-1983-4d14-ab5f-390b3dc090ce\") " pod="openstack/kube-state-metrics-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.694777 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.813029 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.814669 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.816885 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.817295 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.817406 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.820056 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.821739 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-tv6lj" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.833522 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.984004 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9e5146f3-4a88-4e31-82e7-0e0f72188d22-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.984320 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24txn\" (UniqueName: \"kubernetes.io/projected/9e5146f3-4a88-4e31-82e7-0e0f72188d22-kube-api-access-24txn\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.984342 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9e5146f3-4a88-4e31-82e7-0e0f72188d22-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.984485 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9e5146f3-4a88-4e31-82e7-0e0f72188d22-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.984525 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9e5146f3-4a88-4e31-82e7-0e0f72188d22-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.984577 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/9e5146f3-4a88-4e31-82e7-0e0f72188d22-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 
14:15:54 crc kubenswrapper[4817]: I0218 14:15:54.984722 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9e5146f3-4a88-4e31-82e7-0e0f72188d22-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.086380 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24txn\" (UniqueName: \"kubernetes.io/projected/9e5146f3-4a88-4e31-82e7-0e0f72188d22-kube-api-access-24txn\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.086438 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9e5146f3-4a88-4e31-82e7-0e0f72188d22-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.086499 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9e5146f3-4a88-4e31-82e7-0e0f72188d22-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.086526 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9e5146f3-4a88-4e31-82e7-0e0f72188d22-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc 
kubenswrapper[4817]: I0218 14:15:55.086580 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/9e5146f3-4a88-4e31-82e7-0e0f72188d22-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.086621 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9e5146f3-4a88-4e31-82e7-0e0f72188d22-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.086654 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9e5146f3-4a88-4e31-82e7-0e0f72188d22-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.087199 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/9e5146f3-4a88-4e31-82e7-0e0f72188d22-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.090145 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9e5146f3-4a88-4e31-82e7-0e0f72188d22-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc 
kubenswrapper[4817]: I0218 14:15:55.091770 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9e5146f3-4a88-4e31-82e7-0e0f72188d22-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.096243 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9e5146f3-4a88-4e31-82e7-0e0f72188d22-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.096406 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9e5146f3-4a88-4e31-82e7-0e0f72188d22-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.101504 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24txn\" (UniqueName: \"kubernetes.io/projected/9e5146f3-4a88-4e31-82e7-0e0f72188d22-kube-api-access-24txn\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.102694 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9e5146f3-4a88-4e31-82e7-0e0f72188d22-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9e5146f3-4a88-4e31-82e7-0e0f72188d22\") " pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.138555 4817 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.408208 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.410533 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.413020 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-287x5" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.413220 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.413347 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.413247 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.413754 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.413260 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.414043 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.414494 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.440963 4817 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.596218 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-04929794-6ef0-4dce-978a-755fd164a7e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.596345 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.596467 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.596543 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.596581 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.596615 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t56w9\" (UniqueName: \"kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-kube-api-access-t56w9\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.596711 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.596793 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.596881 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.596900 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.699023 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.699080 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.699138 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-04929794-6ef0-4dce-978a-755fd164a7e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.699180 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " 
pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.699214 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.699230 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.699254 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.699275 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t56w9\" (UniqueName: \"kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-kube-api-access-t56w9\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.699294 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.699311 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.702141 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.702618 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.703061 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.705896 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.706434 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.708626 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.708843 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.708890 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-04929794-6ef0-4dce-978a-755fd164a7e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ab4d68949bad9bb24db26ca4380e123968c4e63f4e0711630b3924ef3b41508c/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.709998 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.710537 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.729478 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t56w9\" (UniqueName: \"kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-kube-api-access-t56w9\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:55 crc kubenswrapper[4817]: I0218 14:15:55.746468 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-04929794-6ef0-4dce-978a-755fd164a7e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") pod \"prometheus-metric-storage-0\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:56 crc kubenswrapper[4817]: I0218 14:15:56.046257 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 18 14:15:56 crc kubenswrapper[4817]: I0218 14:15:56.968619 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9bcxg"]
Feb 18 14:15:56 crc kubenswrapper[4817]: I0218 14:15:56.970173 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:56 crc kubenswrapper[4817]: I0218 14:15:56.979398 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jgtzs"
Feb 18 14:15:56 crc kubenswrapper[4817]: I0218 14:15:56.979814 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Feb 18 14:15:56 crc kubenswrapper[4817]: I0218 14:15:56.980072 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Feb 18 14:15:56 crc kubenswrapper[4817]: I0218 14:15:56.988749 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9bcxg"]
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.006323 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-46wx9"]
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.008360 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.024140 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-46wx9"]
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.127429 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-etc-ovs\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.127507 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb73215-bd2a-47eb-bbcf-b4708117244f-ovn-controller-tls-certs\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.127539 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ddb73215-bd2a-47eb-bbcf-b4708117244f-var-log-ovn\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.127565 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddb73215-bd2a-47eb-bbcf-b4708117244f-var-run-ovn\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.127726 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb73215-bd2a-47eb-bbcf-b4708117244f-combined-ca-bundle\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.127808 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-var-run\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.127884 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-var-log\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.127900 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a166377-16ac-4c6b-9207-cddf8c814dc1-scripts\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.127970 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddb73215-bd2a-47eb-bbcf-b4708117244f-var-run\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.128036 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lplc\" (UniqueName: \"kubernetes.io/projected/ddb73215-bd2a-47eb-bbcf-b4708117244f-kube-api-access-6lplc\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.128110 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-var-lib\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.128243 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddb73215-bd2a-47eb-bbcf-b4708117244f-scripts\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.128275 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v56lc\" (UniqueName: \"kubernetes.io/projected/2a166377-16ac-4c6b-9207-cddf8c814dc1-kube-api-access-v56lc\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230178 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddb73215-bd2a-47eb-bbcf-b4708117244f-scripts\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230221 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v56lc\" (UniqueName: \"kubernetes.io/projected/2a166377-16ac-4c6b-9207-cddf8c814dc1-kube-api-access-v56lc\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230253 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-etc-ovs\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230295 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb73215-bd2a-47eb-bbcf-b4708117244f-ovn-controller-tls-certs\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230312 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ddb73215-bd2a-47eb-bbcf-b4708117244f-var-log-ovn\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230352 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddb73215-bd2a-47eb-bbcf-b4708117244f-var-run-ovn\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230393 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb73215-bd2a-47eb-bbcf-b4708117244f-combined-ca-bundle\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230425 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-var-run\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230460 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-var-log\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230477 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a166377-16ac-4c6b-9207-cddf8c814dc1-scripts\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230514 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddb73215-bd2a-47eb-bbcf-b4708117244f-var-run\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230533 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lplc\" (UniqueName: \"kubernetes.io/projected/ddb73215-bd2a-47eb-bbcf-b4708117244f-kube-api-access-6lplc\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.230552 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-var-lib\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.231222 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-var-lib\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.231449 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-etc-ovs\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.231570 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ddb73215-bd2a-47eb-bbcf-b4708117244f-var-run\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.231584 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-var-run\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.231615 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2a166377-16ac-4c6b-9207-cddf8c814dc1-var-log\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.231727 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddb73215-bd2a-47eb-bbcf-b4708117244f-var-run-ovn\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.231861 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ddb73215-bd2a-47eb-bbcf-b4708117244f-var-log-ovn\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.232682 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddb73215-bd2a-47eb-bbcf-b4708117244f-scripts\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.234160 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a166377-16ac-4c6b-9207-cddf8c814dc1-scripts\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.237173 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddb73215-bd2a-47eb-bbcf-b4708117244f-combined-ca-bundle\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.242701 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddb73215-bd2a-47eb-bbcf-b4708117244f-ovn-controller-tls-certs\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.251375 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v56lc\" (UniqueName: \"kubernetes.io/projected/2a166377-16ac-4c6b-9207-cddf8c814dc1-kube-api-access-v56lc\") pod \"ovn-controller-ovs-46wx9\" (UID: \"2a166377-16ac-4c6b-9207-cddf8c814dc1\") " pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.251555 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lplc\" (UniqueName: \"kubernetes.io/projected/ddb73215-bd2a-47eb-bbcf-b4708117244f-kube-api-access-6lplc\") pod \"ovn-controller-9bcxg\" (UID: \"ddb73215-bd2a-47eb-bbcf-b4708117244f\") " pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.301410 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bcxg"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.329293 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-46wx9"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.849045 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.855813 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.868008 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.868370 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.868418 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.868548 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-j7nf6"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.868686 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.873366 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.943726 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89d17a9-16cb-4abe-ba88-107ce95dbceb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.943774 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b89d17a9-16cb-4abe-ba88-107ce95dbceb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.943810 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b89d17a9-16cb-4abe-ba88-107ce95dbceb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.943868 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b89d17a9-16cb-4abe-ba88-107ce95dbceb-config\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.943946 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ece93d41-e78c-442a-8ee3-382130a46e23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ece93d41-e78c-442a-8ee3-382130a46e23\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.944048 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jg7s\" (UniqueName: \"kubernetes.io/projected/b89d17a9-16cb-4abe-ba88-107ce95dbceb-kube-api-access-4jg7s\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.944097 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89d17a9-16cb-4abe-ba88-107ce95dbceb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:57 crc kubenswrapper[4817]: I0218 14:15:57.944186 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89d17a9-16cb-4abe-ba88-107ce95dbceb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.045786 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jg7s\" (UniqueName: \"kubernetes.io/projected/b89d17a9-16cb-4abe-ba88-107ce95dbceb-kube-api-access-4jg7s\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.045839 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89d17a9-16cb-4abe-ba88-107ce95dbceb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.045877 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89d17a9-16cb-4abe-ba88-107ce95dbceb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.045931 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89d17a9-16cb-4abe-ba88-107ce95dbceb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.045959 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b89d17a9-16cb-4abe-ba88-107ce95dbceb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.045999 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b89d17a9-16cb-4abe-ba88-107ce95dbceb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.046031 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b89d17a9-16cb-4abe-ba88-107ce95dbceb-config\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.046056 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ece93d41-e78c-442a-8ee3-382130a46e23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ece93d41-e78c-442a-8ee3-382130a46e23\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.046554 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b89d17a9-16cb-4abe-ba88-107ce95dbceb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.047282 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b89d17a9-16cb-4abe-ba88-107ce95dbceb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.049469 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89d17a9-16cb-4abe-ba88-107ce95dbceb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.049630 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.049656 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ece93d41-e78c-442a-8ee3-382130a46e23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ece93d41-e78c-442a-8ee3-382130a46e23\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e1cddbe35f30c7777cf80fac993ce4a859496febdb978b13783608343668f5be/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.049729 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89d17a9-16cb-4abe-ba88-107ce95dbceb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.052944 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b89d17a9-16cb-4abe-ba88-107ce95dbceb-config\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.056460 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89d17a9-16cb-4abe-ba88-107ce95dbceb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.070158 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jg7s\" (UniqueName: \"kubernetes.io/projected/b89d17a9-16cb-4abe-ba88-107ce95dbceb-kube-api-access-4jg7s\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.083158 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ece93d41-e78c-442a-8ee3-382130a46e23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ece93d41-e78c-442a-8ee3-382130a46e23\") pod \"ovsdbserver-nb-0\" (UID: \"b89d17a9-16cb-4abe-ba88-107ce95dbceb\") " pod="openstack/ovsdbserver-nb-0"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.191645 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-j7nf6"
Feb 18 14:15:58 crc kubenswrapper[4817]: I0218 14:15:58.198227 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 18 14:16:01 crc kubenswrapper[4817]: I0218 14:16:01.906161 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 18 14:16:01 crc kubenswrapper[4817]: I0218 14:16:01.907916 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:01 crc kubenswrapper[4817]: I0218 14:16:01.910689 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Feb 18 14:16:01 crc kubenswrapper[4817]: I0218 14:16:01.910923 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Feb 18 14:16:01 crc kubenswrapper[4817]: I0218 14:16:01.911483 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Feb 18 14:16:01 crc kubenswrapper[4817]: I0218 14:16:01.913405 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 18 14:16:01 crc kubenswrapper[4817]: I0218 14:16:01.918312 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-bql95"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.015727 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/317526d8-4a73-4ae4-9607-b1d7375ba7f6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.015804 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/317526d8-4a73-4ae4-9607-b1d7375ba7f6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.015860 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1f480060-c491-40e8-b70c-ccd757218d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f480060-c491-40e8-b70c-ccd757218d9a\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.015898 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/317526d8-4a73-4ae4-9607-b1d7375ba7f6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.015953 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317526d8-4a73-4ae4-9607-b1d7375ba7f6-config\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.015990 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfgk2\" (UniqueName: \"kubernetes.io/projected/317526d8-4a73-4ae4-9607-b1d7375ba7f6-kube-api-access-pfgk2\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.016277 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/317526d8-4a73-4ae4-9607-b1d7375ba7f6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.016432 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317526d8-4a73-4ae4-9607-b1d7375ba7f6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.117804 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317526d8-4a73-4ae4-9607-b1d7375ba7f6-config\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.117849 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfgk2\" (UniqueName: \"kubernetes.io/projected/317526d8-4a73-4ae4-9607-b1d7375ba7f6-kube-api-access-pfgk2\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.117873 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/317526d8-4a73-4ae4-9607-b1d7375ba7f6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.117925 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317526d8-4a73-4ae4-9607-b1d7375ba7f6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.118062 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/317526d8-4a73-4ae4-9607-b1d7375ba7f6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0"
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.118108
4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/317526d8-4a73-4ae4-9607-b1d7375ba7f6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.118157 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1f480060-c491-40e8-b70c-ccd757218d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f480060-c491-40e8-b70c-ccd757218d9a\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.118188 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/317526d8-4a73-4ae4-9607-b1d7375ba7f6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.119382 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317526d8-4a73-4ae4-9607-b1d7375ba7f6-config\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.120021 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/317526d8-4a73-4ae4-9607-b1d7375ba7f6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.121989 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/317526d8-4a73-4ae4-9607-b1d7375ba7f6-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.133612 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/317526d8-4a73-4ae4-9607-b1d7375ba7f6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.133692 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/317526d8-4a73-4ae4-9607-b1d7375ba7f6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.133767 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/317526d8-4a73-4ae4-9607-b1d7375ba7f6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.134377 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.134405 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1f480060-c491-40e8-b70c-ccd757218d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f480060-c491-40e8-b70c-ccd757218d9a\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9e4e98c2df1b1dba3aeda5f4d2a96ed54691a7766b051fc58af90b8fd49236fc/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.137697 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfgk2\" (UniqueName: \"kubernetes.io/projected/317526d8-4a73-4ae4-9607-b1d7375ba7f6-kube-api-access-pfgk2\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.176740 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1f480060-c491-40e8-b70c-ccd757218d9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f480060-c491-40e8-b70c-ccd757218d9a\") pod \"ovsdbserver-sb-0\" (UID: \"317526d8-4a73-4ae4-9607-b1d7375ba7f6\") " pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:02 crc kubenswrapper[4817]: I0218 14:16:02.231516 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:03 crc kubenswrapper[4817]: E0218 14:16:03.490038 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 18 14:16:03 crc kubenswrapper[4817]: E0218 14:16:03.490675 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7kckq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFi
lesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-r54zh_openstack(0f31baa1-620e-461a-99cb-cf33a2c9ebe9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:16:03 crc kubenswrapper[4817]: E0218 14:16:03.492134 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh" podUID="0f31baa1-620e-461a-99cb-cf33a2c9ebe9" Feb 18 14:16:03 crc kubenswrapper[4817]: I0218 14:16:03.722551 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 14:16:03 crc kubenswrapper[4817]: W0218 14:16:03.740703 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14e634c8_da00_43a5_96a8_33e8bf806873.slice/crio-cc1428f1cfb51837027e2a64e977cce278d40c00f9339e7400e81a892ceddf10 WatchSource:0}: Error finding container cc1428f1cfb51837027e2a64e977cce278d40c00f9339e7400e81a892ceddf10: Status 404 returned error can't find the container with id cc1428f1cfb51837027e2a64e977cce278d40c00f9339e7400e81a892ceddf10 Feb 18 14:16:03 crc kubenswrapper[4817]: I0218 14:16:03.991900 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.010118 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 
14:16:04.030713 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.040162 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.214366 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-46wx9"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.323250 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b50fdcb5-1983-4d14-ab5f-390b3dc090ce","Type":"ContainerStarted","Data":"3437761b58e3decdd124d9e73ffbe01fc7ed463745a654012279e6cffedd4525"} Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.325120 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb347a6f-041d-41e7-be8b-b151f150e6ab","Type":"ContainerStarted","Data":"feb963f8060577e645af462d0d2100753c8094a63d3b9ff6adec3e922ea84c8a"} Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.326527 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"14e634c8-da00-43a5-96a8-33e8bf806873","Type":"ContainerStarted","Data":"cc1428f1cfb51837027e2a64e977cce278d40c00f9339e7400e81a892ceddf10"} Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.327953 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1","Type":"ContainerStarted","Data":"8a663cf915c818bbce22dc9d945e60e5103401059ab3d8fe702524bb142600a5"} Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.330753 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"def7b080-de6e-49f1-9437-44d6f40b48c4","Type":"ContainerStarted","Data":"384139a2cb0e7f246c15c37cfcf752dbff0d2e4c54cb083642ea0b30ac98f751"} Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 
14:16:04.333663 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-46wx9" event={"ID":"2a166377-16ac-4c6b-9207-cddf8c814dc1","Type":"ContainerStarted","Data":"29b24f0f708aea29d61a84277057363e46c6b186eb45bf5f9c28f57c324205b4"} Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.447018 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.472844 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9bcxg"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.488503 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.517601 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.544673 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 18 14:16:04 crc kubenswrapper[4817]: W0218 14:16:04.546652 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb89d17a9_16cb_4abe_ba88_107ce95dbceb.slice/crio-1c5340aaa04bd8de0bb2df62b655ab6b2c73c9034c4cdf7d3f6caa156cedabc9 WatchSource:0}: Error finding container 1c5340aaa04bd8de0bb2df62b655ab6b2c73c9034c4cdf7d3f6caa156cedabc9: Status 404 returned error can't find the container with id 1c5340aaa04bd8de0bb2df62b655ab6b2c73c9034c4cdf7d3f6caa156cedabc9 Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.634921 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh" Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.781265 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kckq\" (UniqueName: \"kubernetes.io/projected/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-kube-api-access-7kckq\") pod \"0f31baa1-620e-461a-99cb-cf33a2c9ebe9\" (UID: \"0f31baa1-620e-461a-99cb-cf33a2c9ebe9\") " Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.781460 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-config\") pod \"0f31baa1-620e-461a-99cb-cf33a2c9ebe9\" (UID: \"0f31baa1-620e-461a-99cb-cf33a2c9ebe9\") " Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.782410 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-config" (OuterVolumeSpecName: "config") pod "0f31baa1-620e-461a-99cb-cf33a2c9ebe9" (UID: "0f31baa1-620e-461a-99cb-cf33a2c9ebe9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.789066 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-kube-api-access-7kckq" (OuterVolumeSpecName: "kube-api-access-7kckq") pod "0f31baa1-620e-461a-99cb-cf33a2c9ebe9" (UID: "0f31baa1-620e-461a-99cb-cf33a2c9ebe9"). InnerVolumeSpecName "kube-api-access-7kckq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.857065 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.863062 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:04 crc kubenswrapper[4817]: W0218 14:16:04.865469 4817 reflector.go:561] object-"openstack"/"cloudkitty-lokistack-config": failed to list *v1.ConfigMap: configmaps "cloudkitty-lokistack-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 18 14:16:04 crc kubenswrapper[4817]: E0218 14:16:04.865509 4817 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cloudkitty-lokistack-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cloudkitty-lokistack-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 14:16:04 crc kubenswrapper[4817]: W0218 14:16:04.865549 4817 reflector.go:561] object-"openstack"/"cloudkitty-lokistack-distributor-grpc": failed to list *v1.Secret: secrets "cloudkitty-lokistack-distributor-grpc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 18 14:16:04 crc kubenswrapper[4817]: E0218 14:16:04.865561 4817 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cloudkitty-lokistack-distributor-grpc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cloudkitty-lokistack-distributor-grpc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 14:16:04 crc kubenswrapper[4817]: W0218 14:16:04.865595 4817 reflector.go:561] object-"openstack"/"cloudkitty-lokistack-ca-bundle": failed to list *v1.ConfigMap: configmaps 
"cloudkitty-lokistack-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 18 14:16:04 crc kubenswrapper[4817]: E0218 14:16:04.865606 4817 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cloudkitty-lokistack-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cloudkitty-lokistack-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 14:16:04 crc kubenswrapper[4817]: W0218 14:16:04.865637 4817 reflector.go:561] object-"openstack"/"cloudkitty-lokistack-distributor-http": failed to list *v1.Secret: secrets "cloudkitty-lokistack-distributor-http" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 18 14:16:04 crc kubenswrapper[4817]: E0218 14:16:04.865648 4817 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cloudkitty-lokistack-distributor-http\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cloudkitty-lokistack-distributor-http\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 14:16:04 crc kubenswrapper[4817]: W0218 14:16:04.865700 4817 reflector.go:561] object-"openstack"/"cloudkitty-lokistack-dockercfg-kjn45": failed to list *v1.Secret: secrets "cloudkitty-lokistack-dockercfg-kjn45" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 18 14:16:04 crc kubenswrapper[4817]: E0218 
14:16:04.865719 4817 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cloudkitty-lokistack-dockercfg-kjn45\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cloudkitty-lokistack-dockercfg-kjn45\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.883136 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.883170 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kckq\" (UniqueName: \"kubernetes.io/projected/0f31baa1-620e-461a-99cb-cf33a2c9ebe9-kube-api-access-7kckq\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.884674 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p"] Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.985023 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.985112 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-ca-bundle\") pod 
\"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.985226 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2qp2\" (UniqueName: \"kubernetes.io/projected/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-kube-api-access-q2qp2\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.985248 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:04 crc kubenswrapper[4817]: I0218 14:16:04.985304 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.077782 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph"] Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.079174 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.086858 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.087221 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.092706 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2qp2\" (UniqueName: \"kubernetes.io/projected/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-kube-api-access-q2qp2\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.092762 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.092905 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.093722 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: 
\"kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.094107 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.098269 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.114946 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph"] Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.166052 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2qp2\" (UniqueName: \"kubernetes.io/projected/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-kube-api-access-q2qp2\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.204171 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxhzn\" (UniqueName: \"kubernetes.io/projected/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-kube-api-access-qxhzn\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 
crc kubenswrapper[4817]: I0218 14:16:05.204261 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.204295 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.204322 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.204394 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.204439 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.314030 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxhzn\" (UniqueName: \"kubernetes.io/projected/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-kube-api-access-qxhzn\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.314129 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.314168 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.314197 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: 
\"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.314294 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.314341 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.314046 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz"] Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.316123 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.319812 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.327547 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.328178 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.338674 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.348083 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.361039 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz"] Feb 18 14:16:05 crc 
kubenswrapper[4817]: I0218 14:16:05.371271 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bcxg" event={"ID":"ddb73215-bd2a-47eb-bbcf-b4708117244f","Type":"ContainerStarted","Data":"e821a2e70e3f5302bfe8336fe045238de942c190ce164020f8ad3e539b4ac214"} Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.374807 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d5d448f4-839e-4b71-ac6e-0c941ccd5a14","Type":"ContainerStarted","Data":"ac599d57fa780b172192a3ed1aedd48964d320e0159a4d96cfa2f1eddda81da0"} Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.376843 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxhzn\" (UniqueName: \"kubernetes.io/projected/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-kube-api-access-qxhzn\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.381260 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9e5146f3-4a88-4e31-82e7-0e0f72188d22","Type":"ContainerStarted","Data":"39535b8e6ee867ac243ed6f3dccb8aa2b5d8e79aebf7a758326b1b4bd1dab3db"} Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.382316 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"641785a9-2372-4857-8882-192bf7d7fe45","Type":"ContainerStarted","Data":"e5063ed28258f8493e0cd97afd5d83eb400039c6f150ba5bd34b9dd1c3b94702"} Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.389115 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b89d17a9-16cb-4abe-ba88-107ce95dbceb","Type":"ContainerStarted","Data":"1c5340aaa04bd8de0bb2df62b655ab6b2c73c9034c4cdf7d3f6caa156cedabc9"} Feb 18 14:16:05 crc 
kubenswrapper[4817]: I0218 14:16:05.391168 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh" event={"ID":"0f31baa1-620e-461a-99cb-cf33a2c9ebe9","Type":"ContainerDied","Data":"d4af17dbcfc28c5d15790b9b0df7bee318dcfd9ff7ce126da6408c8576815e44"} Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.391259 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r54zh" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.415469 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00036a73-dd30-4b48-a135-19b064818e5c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.415524 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/00036a73-dd30-4b48-a135-19b064818e5c-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.418375 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mt9p\" (UniqueName: \"kubernetes.io/projected/00036a73-dd30-4b48-a135-19b064818e5c-kube-api-access-2mt9p\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 
14:16:05.418537 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00036a73-dd30-4b48-a135-19b064818e5c-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.418670 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/00036a73-dd30-4b48-a135-19b064818e5c-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.496540 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r54zh"] Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.504916 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r54zh"] Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.512707 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"] Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.513932 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.517885 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"] Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.527916 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/00036a73-dd30-4b48-a135-19b064818e5c-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.528032 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mt9p\" (UniqueName: \"kubernetes.io/projected/00036a73-dd30-4b48-a135-19b064818e5c-kube-api-access-2mt9p\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.528082 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00036a73-dd30-4b48-a135-19b064818e5c-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.528147 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/00036a73-dd30-4b48-a135-19b064818e5c-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: 
\"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.528275 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00036a73-dd30-4b48-a135-19b064818e5c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.529717 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.530177 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.530347 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.530483 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.530569 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.530692 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.530777 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.538935 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/00036a73-dd30-4b48-a135-19b064818e5c-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.551400 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-klrb8" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.552589 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"] Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.557822 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/00036a73-dd30-4b48-a135-19b064818e5c-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.585872 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mt9p\" (UniqueName: \"kubernetes.io/projected/00036a73-dd30-4b48-a135-19b064818e5c-kube-api-access-2mt9p\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.593520 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"] Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.629677 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.629776 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.629834 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.629864 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.629906 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsq2c\" (UniqueName: \"kubernetes.io/projected/e596742a-2a5e-4a0c-9177-2b5a1ce00651-kube-api-access-bsq2c\") pod 
\"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.629940 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630010 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630065 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630101 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630153 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630171 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630227 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630261 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630311 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630330 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630378 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630412 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs4rf\" (UniqueName: \"kubernetes.io/projected/864c0a91-5aa3-4a84-8b75-6f75e0883aea-kube-api-access-vs4rf\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.630471 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: 
\"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.658652 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732355 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732421 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732448 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsq2c\" (UniqueName: \"kubernetes.io/projected/e596742a-2a5e-4a0c-9177-2b5a1ce00651-kube-api-access-bsq2c\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732486 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 
14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732529 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732561 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732603 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732633 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732651 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-tenants\") pod 
\"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732687    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732713    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732737    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732759    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732785    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732823    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs4rf\" (UniqueName: \"kubernetes.io/projected/864c0a91-5aa3-4a84-8b75-6f75e0883aea-kube-api-access-vs4rf\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732864    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.732933    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.733011    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: E0218 14:16:05.733168    4817 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found
Feb 18 14:16:05 crc kubenswrapper[4817]: E0218 14:16:05.733235    4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-tls-secret podName:e596742a-2a5e-4a0c-9177-2b5a1ce00651 nodeName:}" failed. No retries permitted until 2026-02-18 14:16:06.233214521 +0000 UTC m=+1028.808750504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-tls-secret") pod "cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" (UID: "e596742a-2a5e-4a0c-9177-2b5a1ce00651") : secret "cloudkitty-lokistack-gateway-http" not found
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.733932    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.734217    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.734901    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.735566    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.736127    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: E0218 14:16:05.736541    4817 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found
Feb 18 14:16:05 crc kubenswrapper[4817]: E0218 14:16:05.736701    4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-tls-secret podName:864c0a91-5aa3-4a84-8b75-6f75e0883aea nodeName:}" failed. No retries permitted until 2026-02-18 14:16:06.236680008 +0000 UTC m=+1028.812215991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-tls-secret") pod "cloudkitty-lokistack-gateway-7f8685b49f-n8c26" (UID: "864c0a91-5aa3-4a84-8b75-6f75e0883aea") : secret "cloudkitty-lokistack-gateway-http" not found
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.737246    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.737438    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.737614    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.737657    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.738372    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.741712    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.744780    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.754833    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsq2c\" (UniqueName: \"kubernetes.io/projected/e596742a-2a5e-4a0c-9177-2b5a1ce00651-kube-api-access-bsq2c\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:05 crc kubenswrapper[4817]: I0218 14:16:05.755188    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs4rf\" (UniqueName: \"kubernetes.io/projected/864c0a91-5aa3-4a84-8b75-6f75e0883aea-kube-api-access-vs4rf\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.038002    4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"]
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.039168    4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.048961    4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.051352    4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.080829    4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.084146    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.086528    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.092326    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00036a73-dd30-4b48-a135-19b064818e5c-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz"
Feb 18 14:16:06 crc kubenswrapper[4817]: E0218 14:16:06.094961    4817 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-distributor-http: failed to sync secret cache: timed out waiting for the condition
Feb 18 14:16:06 crc kubenswrapper[4817]: E0218 14:16:06.097028    4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-http podName:f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b nodeName:}" failed. No retries permitted until 2026-02-18 14:16:06.597003035 +0000 UTC m=+1029.172539018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloudkitty-lokistack-distributor-http" (UniqueName: "kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-http") pod "cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" (UID: "f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 14:16:06 crc kubenswrapper[4817]: E0218 14:16:06.095028    4817 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-distributor-grpc: failed to sync secret cache: timed out waiting for the condition
Feb 18 14:16:06 crc kubenswrapper[4817]: E0218 14:16:06.096167    4817 configmap.go:193] Couldn't get configMap openstack/cloudkitty-lokistack-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Feb 18 14:16:06 crc kubenswrapper[4817]: E0218 14:16:06.097162    4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-grpc podName:f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b nodeName:}" failed. No retries permitted until 2026-02-18 14:16:06.597137889 +0000 UTC m=+1029.172673872 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloudkitty-lokistack-distributor-grpc" (UniqueName: "kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-grpc") pod "cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" (UID: "f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b") : failed to sync secret cache: timed out waiting for the condition
Feb 18 14:16:06 crc kubenswrapper[4817]: E0218 14:16:06.097265    4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-ca-bundle podName:f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b nodeName:}" failfailed. No retries permitted until 2026-02-18 14:16:06.597227651 +0000 UTC m=+1029.172763634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloudkitty-lokistack-ca-bundle" (UniqueName: "kubernetes.io/configmap/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-ca-bundle") pod "cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" (UID: "f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b") : failed to sync configmap cache: timed out waiting for the condition
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.099363    4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"]
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.140912    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.141096    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f685dd5-8921-4e4a-a4d5-d19a499775f5-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.141123    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.141168    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.141237    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.141308    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg4s6\" (UniqueName: \"kubernetes.io/projected/7f685dd5-8921-4e4a-a4d5-d19a499775f5-kube-api-access-pg4s6\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.141395    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.141422    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.164798    4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"]
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.166009    4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.168479    4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.169263    4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.191020    4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f31baa1-620e-461a-99cb-cf33a2c9ebe9" path="/var/lib/kubelet/pods/0f31baa1-620e-461a-99cb-cf33a2c9ebe9/volumes"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.191368    4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"]
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.195827    4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-kjn45"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.196869    4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.238613    4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243434    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg4s6\" (UniqueName: \"kubernetes.io/projected/7f685dd5-8921-4e4a-a4d5-d19a499775f5-kube-api-access-pg4s6\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243486    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243548    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75cbd0d0-2a48-48ba-9cae-d465da658b05-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243601    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243633    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243655    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243685    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243717    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243763    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243795    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243816    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xqp9\" (UniqueName: \"kubernetes.io/projected/75cbd0d0-2a48-48ba-9cae-d465da658b05-kube-api-access-9xqp9\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243871    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243898    4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243934    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f685dd5-8921-4e4a-a4d5-d19a499775f5-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243956    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.243999    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.244032    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.245225    4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.245289    4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.245537    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.246276    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f685dd5-8921-4e4a-a4d5-d19a499775f5-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.246383    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7632ade-ab1b-45b8-9f25-9fb98abc4f1a-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-qb4ph\" (UID: \"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.247169    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e596742a-2a5e-4a0c-9177-2b5a1ce00651-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.247961    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864c0a91-5aa3-4a84-8b75-6f75e0883aea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.251408    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00036a73-dd30-4b48-a135-19b064818e5c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz\" (UID: \"00036a73-dd30-4b48-a135-19b064818e5c\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.252058    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.252090    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e596742a-2a5e-4a0c-9177-2b5a1ce00651-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-j7gxb\" (UID: \"e596742a-2a5e-4a0c-9177-2b5a1ce00651\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.254296    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/864c0a91-5aa3-4a84-8b75-6f75e0883aea-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-n8c26\" (UID: \"864c0a91-5aa3-4a84-8b75-6f75e0883aea\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.257274    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.263393    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/7f685dd5-8921-4e4a-a4d5-d19a499775f5-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.265221    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg4s6\" (UniqueName: \"kubernetes.io/projected/7f685dd5-8921-4e4a-a4d5-d19a499775f5-kube-api-access-pg4s6\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.267454    4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.276493    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.281230    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"7f685dd5-8921-4e4a-a4d5-d19a499775f5\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.322591    4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.345821    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75cbd0d0-2a48-48ba-9cae-d465da658b05-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.345904    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.345928    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.346002    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.346033    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xqp9\" (UniqueName: \"kubernetes.io/projected/75cbd0d0-2a48-48ba-9cae-d465da658b05-kube-api-access-9xqp9\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.346082    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.346109    4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.346181    4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.347095    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75cbd0d0-2a48-48ba-9cae-d465da658b05-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.347716    4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.351340 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.354217 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.354517 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.356695 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/75cbd0d0-2a48-48ba-9cae-d465da658b05-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.367313 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xqp9\" (UniqueName: \"kubernetes.io/projected/75cbd0d0-2a48-48ba-9cae-d465da658b05-kube-api-access-9xqp9\") pod \"cloudkitty-lokistack-compactor-0\" 
(UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.368922 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"75cbd0d0-2a48-48ba-9cae-d465da658b05\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.383202 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.402556 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"317526d8-4a73-4ae4-9607-b1d7375ba7f6","Type":"ContainerStarted","Data":"8e4d06748b9d1f09b553072523342553222423375757ea6216ba69792e03e19c"} Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.444213 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.449402 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.454232 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.454796 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.454953 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.473624 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.484545 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.489053 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.550373 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dgl8\" (UniqueName: \"kubernetes.io/projected/6d8b01c7-a1be-49d1-8417-ce412fa834a4-kube-api-access-9dgl8\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.550448 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.550482 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8b01c7-a1be-49d1-8417-ce412fa834a4-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.550510 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.550531 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.550605 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.550637 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.654062 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 
14:16:06.654314 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.654344 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.655142 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.655303 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dgl8\" (UniqueName: \"kubernetes.io/projected/6d8b01c7-a1be-49d1-8417-ce412fa834a4-kube-api-access-9dgl8\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.655375 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: 
\"kubernetes.io/secret/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.655421 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8b01c7-a1be-49d1-8417-ce412fa834a4-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.655468 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.655487 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.655534 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:06 crc kubenswrapper[4817]: 
I0218 14:16:06.655608 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.655634 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.656733 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.656824 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8b01c7-a1be-49d1-8417-ce412fa834a4-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.660822 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-grpc\") pod 
\"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.663692 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-fmj4p\" (UID: \"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.672546 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.677512 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.677958 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6d8b01c7-a1be-49d1-8417-ce412fa834a4-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.684639 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dgl8\" (UniqueName: \"kubernetes.io/projected/6d8b01c7-a1be-49d1-8417-ce412fa834a4-kube-api-access-9dgl8\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.690266 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"6d8b01c7-a1be-49d1-8417-ce412fa834a4\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.692161 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:06 crc kubenswrapper[4817]: E0218 14:16:06.711184 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 18 14:16:06 crc kubenswrapper[4817]: E0218 14:16:06.711344 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g4q62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-lfrgc_openstack(5ffa1255-1d7c-43a4-a197-931d34164a31): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:16:06 crc kubenswrapper[4817]: E0218 14:16:06.713438 4817 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" podUID="5ffa1255-1d7c-43a4-a197-931d34164a31" Feb 18 14:16:06 crc kubenswrapper[4817]: I0218 14:16:06.785962 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.110499 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.126127 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz"] Feb 18 14:16:07 crc kubenswrapper[4817]: W0218 14:16:07.127312 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75cbd0d0_2a48_48ba_9cae_d465da658b05.slice/crio-f3eb39af1dd4352636b4f55105315713bf0fd6ad590dd6c015baf3cf8a643db5 WatchSource:0}: Error finding container f3eb39af1dd4352636b4f55105315713bf0fd6ad590dd6c015baf3cf8a643db5: Status 404 returned error can't find the container with id f3eb39af1dd4352636b4f55105315713bf0fd6ad590dd6c015baf3cf8a643db5 Feb 18 14:16:07 crc kubenswrapper[4817]: W0218 14:16:07.132559 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00036a73_dd30_4b48_a135_19b064818e5c.slice/crio-da228ba0f633ed20edb5e6fd8eab7c376ed5423039b73c80199fbb698aaf2f0d WatchSource:0}: Error finding container da228ba0f633ed20edb5e6fd8eab7c376ed5423039b73c80199fbb698aaf2f0d: Status 404 returned error can't find the container with id da228ba0f633ed20edb5e6fd8eab7c376ed5423039b73c80199fbb698aaf2f0d Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.255182 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26"] Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.360389 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.409189 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb"] Feb 18 14:16:07 crc kubenswrapper[4817]: W0218 14:16:07.410332 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7632ade_ab1b_45b8_9f25_9fb98abc4f1a.slice/crio-06c4df5877f53f28e6c94890947152a0613ad6eee1f1e3a96274bf36ec18fcc2 WatchSource:0}: Error finding container 06c4df5877f53f28e6c94890947152a0613ad6eee1f1e3a96274bf36ec18fcc2: Status 404 returned error can't find the container with id 06c4df5877f53f28e6c94890947152a0613ad6eee1f1e3a96274bf36ec18fcc2 Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.430747 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph"] Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.432105 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"7f685dd5-8921-4e4a-a4d5-d19a499775f5","Type":"ContainerStarted","Data":"769861d2ea8526ec1ae406e85ef81c28a4c9ca89ba4623da2e9bdae22f804bde"} Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.450532 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p"] Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.457525 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.460807 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" 
event={"ID":"75cbd0d0-2a48-48ba-9cae-d465da658b05","Type":"ContainerStarted","Data":"f3eb39af1dd4352636b4f55105315713bf0fd6ad590dd6c015baf3cf8a643db5"} Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.465723 4817 generic.go:334] "Generic (PLEG): container finished" podID="c43d705a-6aa5-43a0-839d-1ab705e28be6" containerID="0e5784205d0c73ad98cf5439c747c096092e48a2f0b513854e238d775e9212d8" exitCode=0 Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.465800 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" event={"ID":"c43d705a-6aa5-43a0-839d-1ab705e28be6","Type":"ContainerDied","Data":"0e5784205d0c73ad98cf5439c747c096092e48a2f0b513854e238d775e9212d8"} Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.468786 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" event={"ID":"e596742a-2a5e-4a0c-9177-2b5a1ce00651","Type":"ContainerStarted","Data":"7676f505618f66eeeb8248c263b8435e20f659604d25deea14ca143778ab87d2"} Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.470555 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" event={"ID":"864c0a91-5aa3-4a84-8b75-6f75e0883aea","Type":"ContainerStarted","Data":"8729d995cc06d07d95d67bb61dbd8739c191a692aa1f4d1a74d2071094409dca"} Feb 18 14:16:07 crc kubenswrapper[4817]: I0218 14:16:07.472287 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" event={"ID":"00036a73-dd30-4b48-a135-19b064818e5c","Type":"ContainerStarted","Data":"da228ba0f633ed20edb5e6fd8eab7c376ed5423039b73c80199fbb698aaf2f0d"} Feb 18 14:16:07 crc kubenswrapper[4817]: E0218 14:16:07.552317 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" 
Feb 18 14:16:07 crc kubenswrapper[4817]: E0218 14:16:07.552474 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2gdzv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmor
Profile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-d4d9k_openstack(a0f55bee-e270-4511-b150-5fe86f80e614): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:16:07 crc kubenswrapper[4817]: E0218 14:16:07.553790 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k" podUID="a0f55bee-e270-4511-b150-5fe86f80e614" Feb 18 14:16:07 crc kubenswrapper[4817]: E0218 14:16:07.855404 4817 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 18 14:16:07 crc kubenswrapper[4817]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/c43d705a-6aa5-43a0-839d-1ab705e28be6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 18 14:16:07 crc kubenswrapper[4817]: > podSandboxID="f2aadc52b196d6aa00c6e8eb3526ecad5b6465654003e4e07159bf48070be824" Feb 18 14:16:07 crc kubenswrapper[4817]: E0218 14:16:07.855570 4817 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 18 14:16:07 crc kubenswrapper[4817]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-42gj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-lczjn_openstack(c43d705a-6aa5-43a0-839d-1ab705e28be6): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/c43d705a-6aa5-43a0-839d-1ab705e28be6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 18 14:16:07 crc kubenswrapper[4817]: > logger="UnhandledError" Feb 18 14:16:07 crc kubenswrapper[4817]: E0218 14:16:07.858048 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/c43d705a-6aa5-43a0-839d-1ab705e28be6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" podUID="c43d705a-6aa5-43a0-839d-1ab705e28be6" Feb 18 14:16:08 crc kubenswrapper[4817]: I0218 14:16:08.501238 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" event={"ID":"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b","Type":"ContainerStarted","Data":"aa6ba756e51d475e05e10c05e3c71324ebcdfb0ced51b3d11698006e6927fb37"} 
Feb 18 14:16:08 crc kubenswrapper[4817]: I0218 14:16:08.503712 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"6d8b01c7-a1be-49d1-8417-ce412fa834a4","Type":"ContainerStarted","Data":"94045979667fcef08338be123a167de2b501a62e66679167b3c8dd49bcda1437"} Feb 18 14:16:08 crc kubenswrapper[4817]: I0218 14:16:08.504955 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" event={"ID":"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a","Type":"ContainerStarted","Data":"06c4df5877f53f28e6c94890947152a0613ad6eee1f1e3a96274bf36ec18fcc2"} Feb 18 14:16:08 crc kubenswrapper[4817]: I0218 14:16:08.508390 4817 generic.go:334] "Generic (PLEG): container finished" podID="5ffa1255-1d7c-43a4-a197-931d34164a31" containerID="c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b" exitCode=0 Feb 18 14:16:08 crc kubenswrapper[4817]: I0218 14:16:08.508474 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" event={"ID":"5ffa1255-1d7c-43a4-a197-931d34164a31","Type":"ContainerDied","Data":"c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b"} Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.007379 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k" Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.129233 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gdzv\" (UniqueName: \"kubernetes.io/projected/a0f55bee-e270-4511-b150-5fe86f80e614-kube-api-access-2gdzv\") pod \"a0f55bee-e270-4511-b150-5fe86f80e614\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.129363 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-dns-svc\") pod \"a0f55bee-e270-4511-b150-5fe86f80e614\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.129656 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-config\") pod \"a0f55bee-e270-4511-b150-5fe86f80e614\" (UID: \"a0f55bee-e270-4511-b150-5fe86f80e614\") " Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.130178 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-config" (OuterVolumeSpecName: "config") pod "a0f55bee-e270-4511-b150-5fe86f80e614" (UID: "a0f55bee-e270-4511-b150-5fe86f80e614"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.130555 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0f55bee-e270-4511-b150-5fe86f80e614" (UID: "a0f55bee-e270-4511-b150-5fe86f80e614"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.158729 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f55bee-e270-4511-b150-5fe86f80e614-kube-api-access-2gdzv" (OuterVolumeSpecName: "kube-api-access-2gdzv") pod "a0f55bee-e270-4511-b150-5fe86f80e614" (UID: "a0f55bee-e270-4511-b150-5fe86f80e614"). InnerVolumeSpecName "kube-api-access-2gdzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.232097 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.232141 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gdzv\" (UniqueName: \"kubernetes.io/projected/a0f55bee-e270-4511-b150-5fe86f80e614-kube-api-access-2gdzv\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.232155 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0f55bee-e270-4511-b150-5fe86f80e614-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.523164 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k" event={"ID":"a0f55bee-e270-4511-b150-5fe86f80e614","Type":"ContainerDied","Data":"e7376d29005cf9cbf6b4d2fddee5837f70d9c5122f50ac3c0bbbeaad505304b7"} Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.523204 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-d4d9k" Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.565339 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d4d9k"] Feb 18 14:16:10 crc kubenswrapper[4817]: I0218 14:16:10.571766 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-d4d9k"] Feb 18 14:16:12 crc kubenswrapper[4817]: I0218 14:16:12.183867 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f55bee-e270-4511-b150-5fe86f80e614" path="/var/lib/kubelet/pods/a0f55bee-e270-4511-b150-5fe86f80e614/volumes" Feb 18 14:16:12 crc kubenswrapper[4817]: I0218 14:16:12.863476 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:16:12 crc kubenswrapper[4817]: I0218 14:16:12.863549 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:16:18 crc kubenswrapper[4817]: E0218 14:16:18.739916 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified" Feb 18 14:16:18 crc kubenswrapper[4817]: E0218 14:16:18.741087 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n596hdch667h5bh6dh8dh76h59bhf4hd8h5fch9dhd4h68hbch5cbh89h675hbbh5d6h55fh668h66fh68fh88hd6h5bbh598hf9h7bh695h9fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v56lc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN 
SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-46wx9_openstack(2a166377-16ac-4c6b-9207-cddf8c814dc1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:16:18 crc kubenswrapper[4817]: E0218 14:16:18.742348 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-46wx9" podUID="2a166377-16ac-4c6b-9207-cddf8c814dc1" Feb 18 14:16:19 crc kubenswrapper[4817]: E0218 14:16:19.599936 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-46wx9" podUID="2a166377-16ac-4c6b-9207-cddf8c814dc1" Feb 18 14:16:20 crc kubenswrapper[4817]: E0218 14:16:20.858795 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Feb 18 14:16:20 crc kubenswrapper[4817]: E0218 14:16:20.859289 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lthwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(def7b080-de6e-49f1-9437-44d6f40b48c4): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:16:20 crc kubenswrapper[4817]: E0218 14:16:20.860483 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="def7b080-de6e-49f1-9437-44d6f40b48c4" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.001826 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.002030 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wgh5g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:16:21 crc 
kubenswrapper[4817]: E0218 14:16:21.003256 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.184858 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.185479 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-
prh8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(641785a9-2372-4857-8882-192bf7d7fe45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.186774 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="641785a9-2372-4857-8882-192bf7d7fe45" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.557518 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.557758 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key 
--tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bsq2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-7f8685b49f-j7gxb_openstack(e596742a-2a5e-4a0c-9177-2b5a1ce00651): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.559033 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" podUID="e596742a-2a5e-4a0c-9177-2b5a1ce00651" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.610865 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" podUID="e596742a-2a5e-4a0c-9177-2b5a1ce00651" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.610971 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="def7b080-de6e-49f1-9437-44d6f40b48c4" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.612045 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" Feb 18 14:16:21 crc kubenswrapper[4817]: E0218 14:16:21.612062 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="641785a9-2372-4857-8882-192bf7d7fe45" Feb 18 14:16:22 crc kubenswrapper[4817]: E0218 14:16:22.593364 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981" Feb 18 14:16:22 crc kubenswrapper[4817]: E0218 14:16:22.593599 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:loki-querier,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=querier -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-c
a-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qxhzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-querier-58c84b5844-qb4ph_openstack(b7632ade-ab1b-45b8-9f25-9fb98abc4f1a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 18 14:16:22 crc kubenswrapper[4817]: E0218 14:16:22.594946 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" podUID="b7632ade-ab1b-45b8-9f25-9fb98abc4f1a" Feb 18 14:16:22 crc kubenswrapper[4817]: E0218 14:16:22.618942 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" podUID="b7632ade-ab1b-45b8-9f25-9fb98abc4f1a" Feb 18 14:16:22 crc kubenswrapper[4817]: E0218 14:16:22.854440 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Feb 18 14:16:22 crc kubenswrapper[4817]: E0218 14:16:22.854722 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc4h565h87h576h64fhcfh5fdh65h667h5f5hbbh5ch545h5fh687h5b6h569h5b5h6hfhfh569hd7h5fbh687h679h99h5c5h5b9h64bh9fh5f4q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pfgk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{C
ommand:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(317526d8-4a73-4ae4-9607-b1d7375ba7f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:16:23 crc kubenswrapper[4817]: E0218 14:16:23.217570 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Feb 18 14:16:23 crc 
kubenswrapper[4817]: E0218 14:16:23.217757 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n596hdch667h5bh6dh8dh76h59bhf4hd8h5fch9dhd4h68hbch5cbh89h675hbbh5d6h55fh668h66fh68fh88hd6h5bbh598hf9h7bh695h9fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,
RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lplc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-9bcxg_openstack(ddb73215-bd2a-47eb-bbcf-b4708117244f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:16:23 crc kubenswrapper[4817]: E0218 14:16:23.219045 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-9bcxg" podUID="ddb73215-bd2a-47eb-bbcf-b4708117244f" Feb 18 14:16:23 crc kubenswrapper[4817]: E0218 14:16:23.518349 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Feb 18 14:16:23 crc kubenswrapper[4817]: E0218 14:16:23.518505 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n595hcfh5f7h5c4h75h557h67chd9h56ch6hfh678h56ch54fh646h56dh658h5bdh5fh75h675h66ch87h5bbh547h558h99h595h677h56fh547h55fq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb
-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4jg7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(b89d17a9-16cb-4abe-ba88-107ce95dbceb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:16:23 crc kubenswrapper[4817]: E0218 14:16:23.628838 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-9bcxg" podUID="ddb73215-bd2a-47eb-bbcf-b4708117244f" Feb 18 14:16:24 crc kubenswrapper[4817]: I0218 14:16:24.638001 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" event={"ID":"5ffa1255-1d7c-43a4-a197-931d34164a31","Type":"ContainerStarted","Data":"57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac"} Feb 18 14:16:24 crc kubenswrapper[4817]: I0218 14:16:24.638538 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:16:24 crc kubenswrapper[4817]: I0218 14:16:24.665269 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" podStartSLOduration=-9223371999.189528 podStartE2EDuration="37.665247418s" podCreationTimestamp="2026-02-18 14:15:47 +0000 UTC" firstStartedPulling="2026-02-18 14:15:48.318794576 +0000 UTC m=+1010.894330559" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:16:24.658236682 +0000 UTC m=+1047.233772685" watchObservedRunningTime="2026-02-18 14:16:24.665247418 +0000 UTC m=+1047.240783401" Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.657140 4817 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" event={"ID":"864c0a91-5aa3-4a84-8b75-6f75e0883aea","Type":"ContainerStarted","Data":"e48e858b8c4b16093fc5ebedab50a57eac1a6fd272d6a6ab6167f14ffaae06be"} Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.657801 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.659684 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb347a6f-041d-41e7-be8b-b151f150e6ab","Type":"ContainerStarted","Data":"b87d4333deb83bffddf631c35a07ce191b062de5ae7eb8d6470f2242217721fe"} Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.660071 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.661144 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"6d8b01c7-a1be-49d1-8417-ce412fa834a4","Type":"ContainerStarted","Data":"f274073c41518ee0a6368b64daf00715189bcddade2f4279d4c7a4fe214495ea"} Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.661210 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.663794 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" event={"ID":"c43d705a-6aa5-43a0-839d-1ab705e28be6","Type":"ContainerStarted","Data":"2982da1050d613096062e7dc3c83a58f7673e71ec3b39f1cb513bce1ddb34824"} Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.664004 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.671669 4817 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.683497 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" podStartSLOduration=5.381866285 podStartE2EDuration="21.683474147s" podCreationTimestamp="2026-02-18 14:16:05 +0000 UTC" firstStartedPulling="2026-02-18 14:16:07.281448 +0000 UTC m=+1029.856983993" lastFinishedPulling="2026-02-18 14:16:23.583055882 +0000 UTC m=+1046.158591855" observedRunningTime="2026-02-18 14:16:26.678775649 +0000 UTC m=+1049.254311642" watchObservedRunningTime="2026-02-18 14:16:26.683474147 +0000 UTC m=+1049.259010120" Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.729959 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=5.577970569 podStartE2EDuration="21.729937502s" podCreationTimestamp="2026-02-18 14:16:05 +0000 UTC" firstStartedPulling="2026-02-18 14:16:07.432232068 +0000 UTC m=+1030.007768051" lastFinishedPulling="2026-02-18 14:16:23.584199001 +0000 UTC m=+1046.159734984" observedRunningTime="2026-02-18 14:16:26.708433853 +0000 UTC m=+1049.283969836" watchObservedRunningTime="2026-02-18 14:16:26.729937502 +0000 UTC m=+1049.305473495" Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.756307 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" podStartSLOduration=21.170950143 podStartE2EDuration="39.756285243s" podCreationTimestamp="2026-02-18 14:15:47 +0000 UTC" firstStartedPulling="2026-02-18 14:15:48.039399976 +0000 UTC m=+1010.614935959" lastFinishedPulling="2026-02-18 14:16:06.624735076 +0000 UTC m=+1029.200271059" observedRunningTime="2026-02-18 14:16:26.748546819 +0000 UTC m=+1049.324082812" watchObservedRunningTime="2026-02-18 
14:16:26.756285243 +0000 UTC m=+1049.331821226" Feb 18 14:16:26 crc kubenswrapper[4817]: I0218 14:16:26.779215 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.991493305 podStartE2EDuration="35.779190707s" podCreationTimestamp="2026-02-18 14:15:51 +0000 UTC" firstStartedPulling="2026-02-18 14:16:04.013918355 +0000 UTC m=+1026.589454338" lastFinishedPulling="2026-02-18 14:16:22.801615757 +0000 UTC m=+1045.377151740" observedRunningTime="2026-02-18 14:16:26.772547221 +0000 UTC m=+1049.348083214" watchObservedRunningTime="2026-02-18 14:16:26.779190707 +0000 UTC m=+1049.354726690" Feb 18 14:16:27 crc kubenswrapper[4817]: I0218 14:16:27.673654 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"7f685dd5-8921-4e4a-a4d5-d19a499775f5","Type":"ContainerStarted","Data":"cba08099f82ee6f50a51378deef8b46efee6b0dba36801c07f6fbc0023187cb0"} Feb 18 14:16:27 crc kubenswrapper[4817]: I0218 14:16:27.674007 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 18 14:16:27 crc kubenswrapper[4817]: I0218 14:16:27.679049 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" event={"ID":"00036a73-dd30-4b48-a135-19b064818e5c","Type":"ContainerStarted","Data":"edefdb5d37e44040c9100da30c4075b793f343dd810897ea2ccce814d5072571"} Feb 18 14:16:27 crc kubenswrapper[4817]: I0218 14:16:27.679408 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:27 crc kubenswrapper[4817]: I0218 14:16:27.681487 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"14e634c8-da00-43a5-96a8-33e8bf806873","Type":"ContainerStarted","Data":"9a5e9d303e366c3bb999a78527cf3761a74f192fe80921e3e7ff942f31d4204e"} Feb 18 14:16:27 crc kubenswrapper[4817]: I0218 14:16:27.703384 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=7.323407143 podStartE2EDuration="23.703357322s" podCreationTimestamp="2026-02-18 14:16:04 +0000 UTC" firstStartedPulling="2026-02-18 14:16:07.397696313 +0000 UTC m=+1029.973232306" lastFinishedPulling="2026-02-18 14:16:23.777646502 +0000 UTC m=+1046.353182485" observedRunningTime="2026-02-18 14:16:27.696434888 +0000 UTC m=+1050.271970871" watchObservedRunningTime="2026-02-18 14:16:27.703357322 +0000 UTC m=+1050.278893305" Feb 18 14:16:27 crc kubenswrapper[4817]: E0218 14:16:27.745313 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="317526d8-4a73-4ae4-9607-b1d7375ba7f6" Feb 18 14:16:27 crc kubenswrapper[4817]: I0218 14:16:27.761225 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" podStartSLOduration=5.87562957 podStartE2EDuration="22.761204602s" podCreationTimestamp="2026-02-18 14:16:05 +0000 UTC" firstStartedPulling="2026-02-18 14:16:07.135990666 +0000 UTC m=+1029.711526649" lastFinishedPulling="2026-02-18 14:16:24.021565698 +0000 UTC m=+1046.597101681" observedRunningTime="2026-02-18 14:16:27.754829352 +0000 UTC m=+1050.330365355" watchObservedRunningTime="2026-02-18 14:16:27.761204602 +0000 UTC m=+1050.336740585" Feb 18 14:16:27 crc kubenswrapper[4817]: E0218 14:16:27.870652 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="b89d17a9-16cb-4abe-ba88-107ce95dbceb" Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.691638 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"317526d8-4a73-4ae4-9607-b1d7375ba7f6","Type":"ContainerStarted","Data":"9d72fe6d5fa500fcc890c98b0571bda9f4c6bc95295439d8ad72155133d38169"} Feb 18 14:16:28 crc kubenswrapper[4817]: E0218 14:16:28.693533 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="317526d8-4a73-4ae4-9607-b1d7375ba7f6" Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.694647 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9e5146f3-4a88-4e31-82e7-0e0f72188d22","Type":"ContainerStarted","Data":"4550af354e970212218555e1f005bf49df45aae74b88d6f3ee3117d435e93783"} Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.696799 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b50fdcb5-1983-4d14-ab5f-390b3dc090ce","Type":"ContainerStarted","Data":"024c6cade3c38911d1846544533703753c41b2dbdc33336d6261e602485f6d9b"} Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.696908 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.699846 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b89d17a9-16cb-4abe-ba88-107ce95dbceb","Type":"ContainerStarted","Data":"f0029a2ae0f28a03d03dd2c46b2a0f76bd1d6312948db47073acd5fce82fe6d0"} Feb 18 14:16:28 crc kubenswrapper[4817]: E0218 14:16:28.701475 4817 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="b89d17a9-16cb-4abe-ba88-107ce95dbceb" Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.702911 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"75cbd0d0-2a48-48ba-9cae-d465da658b05","Type":"ContainerStarted","Data":"9b41a7d72834ac4d316339f4c211a26f2c0cbfdb0d18cbdc8cecf0955cd053d5"} Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.703019 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.706053 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d5d448f4-839e-4b71-ac6e-0c941ccd5a14","Type":"ContainerStarted","Data":"c7916262128ab97149b2ff1a8ebd8ab2eaf07d5003c9118d7e36bb46bc6a4812"} Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.708213 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" event={"ID":"f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b","Type":"ContainerStarted","Data":"bbd48c703832e79e50d9fee662eb5fa65c8ad2f48c46d251df71a548ee90a42d"} Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.787260 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" podStartSLOduration=8.441805696 podStartE2EDuration="24.78724355s" podCreationTimestamp="2026-02-18 14:16:04 +0000 UTC" firstStartedPulling="2026-02-18 14:16:07.432227758 +0000 UTC m=+1030.007763741" lastFinishedPulling="2026-02-18 14:16:23.777665612 +0000 UTC m=+1046.353201595" observedRunningTime="2026-02-18 
14:16:28.785046395 +0000 UTC m=+1051.360582378" watchObservedRunningTime="2026-02-18 14:16:28.78724355 +0000 UTC m=+1051.362779533" Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.815250 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=7.167234647 podStartE2EDuration="23.815227412s" podCreationTimestamp="2026-02-18 14:16:05 +0000 UTC" firstStartedPulling="2026-02-18 14:16:07.130824266 +0000 UTC m=+1029.706360249" lastFinishedPulling="2026-02-18 14:16:23.778817031 +0000 UTC m=+1046.354353014" observedRunningTime="2026-02-18 14:16:28.804519093 +0000 UTC m=+1051.380055096" watchObservedRunningTime="2026-02-18 14:16:28.815227412 +0000 UTC m=+1051.390763395" Feb 18 14:16:28 crc kubenswrapper[4817]: I0218 14:16:28.834441 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.621817063 podStartE2EDuration="34.834416013s" podCreationTimestamp="2026-02-18 14:15:54 +0000 UTC" firstStartedPulling="2026-02-18 14:16:04.044495311 +0000 UTC m=+1026.620031294" lastFinishedPulling="2026-02-18 14:16:27.257094261 +0000 UTC m=+1049.832630244" observedRunningTime="2026-02-18 14:16:28.827417377 +0000 UTC m=+1051.402953380" watchObservedRunningTime="2026-02-18 14:16:28.834416013 +0000 UTC m=+1051.409951996" Feb 18 14:16:29 crc kubenswrapper[4817]: I0218 14:16:29.715812 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:29 crc kubenswrapper[4817]: E0218 14:16:29.717595 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="317526d8-4a73-4ae4-9607-b1d7375ba7f6" Feb 18 14:16:29 crc 
kubenswrapper[4817]: E0218 14:16:29.717623 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="b89d17a9-16cb-4abe-ba88-107ce95dbceb" Feb 18 14:16:31 crc kubenswrapper[4817]: I0218 14:16:31.737596 4817 generic.go:334] "Generic (PLEG): container finished" podID="2a166377-16ac-4c6b-9207-cddf8c814dc1" containerID="f0be46051fed9bc97ad149e5c230305b5c7a53cbd342a60be333f9e179df0b1a" exitCode=0 Feb 18 14:16:31 crc kubenswrapper[4817]: I0218 14:16:31.737690 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-46wx9" event={"ID":"2a166377-16ac-4c6b-9207-cddf8c814dc1","Type":"ContainerDied","Data":"f0be46051fed9bc97ad149e5c230305b5c7a53cbd342a60be333f9e179df0b1a"} Feb 18 14:16:31 crc kubenswrapper[4817]: I0218 14:16:31.807205 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 18 14:16:32 crc kubenswrapper[4817]: I0218 14:16:32.431181 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:16:32 crc kubenswrapper[4817]: I0218 14:16:32.766768 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-46wx9" event={"ID":"2a166377-16ac-4c6b-9207-cddf8c814dc1","Type":"ContainerStarted","Data":"3552ead053575717b3780faa919eeb14483c2886d5270476012b5dc382807e7a"} Feb 18 14:16:32 crc kubenswrapper[4817]: I0218 14:16:32.766816 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-46wx9" event={"ID":"2a166377-16ac-4c6b-9207-cddf8c814dc1","Type":"ContainerStarted","Data":"15837fd7c12dc56b5f26884c40e736136135838ade6dc3515068827dc3f4f27e"} Feb 18 14:16:32 crc kubenswrapper[4817]: I0218 14:16:32.767993 4817 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-46wx9" Feb 18 14:16:32 crc kubenswrapper[4817]: I0218 14:16:32.768056 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-46wx9" Feb 18 14:16:32 crc kubenswrapper[4817]: I0218 14:16:32.790017 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-46wx9" podStartSLOduration=10.327491854 podStartE2EDuration="36.789994171s" podCreationTimestamp="2026-02-18 14:15:56 +0000 UTC" firstStartedPulling="2026-02-18 14:16:04.217573117 +0000 UTC m=+1026.793109090" lastFinishedPulling="2026-02-18 14:16:30.680075414 +0000 UTC m=+1053.255611407" observedRunningTime="2026-02-18 14:16:32.7891347 +0000 UTC m=+1055.364670683" watchObservedRunningTime="2026-02-18 14:16:32.789994171 +0000 UTC m=+1055.365530154" Feb 18 14:16:32 crc kubenswrapper[4817]: I0218 14:16:32.809134 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:16:32 crc kubenswrapper[4817]: I0218 14:16:32.895001 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lczjn"] Feb 18 14:16:32 crc kubenswrapper[4817]: I0218 14:16:32.895398 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" podUID="c43d705a-6aa5-43a0-839d-1ab705e28be6" containerName="dnsmasq-dns" containerID="cri-o://2982da1050d613096062e7dc3c83a58f7673e71ec3b39f1cb513bce1ddb34824" gracePeriod=10 Feb 18 14:16:33 crc kubenswrapper[4817]: I0218 14:16:33.787699 4817 generic.go:334] "Generic (PLEG): container finished" podID="c43d705a-6aa5-43a0-839d-1ab705e28be6" containerID="2982da1050d613096062e7dc3c83a58f7673e71ec3b39f1cb513bce1ddb34824" exitCode=0 Feb 18 14:16:33 crc kubenswrapper[4817]: I0218 14:16:33.787776 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" 
event={"ID":"c43d705a-6aa5-43a0-839d-1ab705e28be6","Type":"ContainerDied","Data":"2982da1050d613096062e7dc3c83a58f7673e71ec3b39f1cb513bce1ddb34824"} Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.465610 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-95m6m"] Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.469525 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.486423 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-95m6m"] Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.509100 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-config\") pod \"dnsmasq-dns-7cb5889db5-95m6m\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.509172 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-95m6m\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.509300 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4qwr\" (UniqueName: \"kubernetes.io/projected/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-kube-api-access-n4qwr\") pod \"dnsmasq-dns-7cb5889db5-95m6m\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.611107 4817 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-config\") pod \"dnsmasq-dns-7cb5889db5-95m6m\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.611526 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-95m6m\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.611625 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4qwr\" (UniqueName: \"kubernetes.io/projected/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-kube-api-access-n4qwr\") pod \"dnsmasq-dns-7cb5889db5-95m6m\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.612359 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-config\") pod \"dnsmasq-dns-7cb5889db5-95m6m\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.612892 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-95m6m\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.637869 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4qwr\" (UniqueName: 
\"kubernetes.io/projected/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-kube-api-access-n4qwr\") pod \"dnsmasq-dns-7cb5889db5-95m6m\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.704226 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.800221 4817 generic.go:334] "Generic (PLEG): container finished" podID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerID="c7916262128ab97149b2ff1a8ebd8ab2eaf07d5003c9118d7e36bb46bc6a4812" exitCode=0 Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.801163 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d5d448f4-839e-4b71-ac6e-0c941ccd5a14","Type":"ContainerDied","Data":"c7916262128ab97149b2ff1a8ebd8ab2eaf07d5003c9118d7e36bb46bc6a4812"} Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.804473 4817 generic.go:334] "Generic (PLEG): container finished" podID="9e5146f3-4a88-4e31-82e7-0e0f72188d22" containerID="4550af354e970212218555e1f005bf49df45aae74b88d6f3ee3117d435e93783" exitCode=0 Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.804615 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9e5146f3-4a88-4e31-82e7-0e0f72188d22","Type":"ContainerDied","Data":"4550af354e970212218555e1f005bf49df45aae74b88d6f3ee3117d435e93783"} Feb 18 14:16:34 crc kubenswrapper[4817]: I0218 14:16:34.848156 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.327341 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-95m6m"] Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.538205 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.641884 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-config\") pod \"c43d705a-6aa5-43a0-839d-1ab705e28be6\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.642017 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-dns-svc\") pod \"c43d705a-6aa5-43a0-839d-1ab705e28be6\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.642189 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42gj8\" (UniqueName: \"kubernetes.io/projected/c43d705a-6aa5-43a0-839d-1ab705e28be6-kube-api-access-42gj8\") pod \"c43d705a-6aa5-43a0-839d-1ab705e28be6\" (UID: \"c43d705a-6aa5-43a0-839d-1ab705e28be6\") " Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.651046 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c43d705a-6aa5-43a0-839d-1ab705e28be6-kube-api-access-42gj8" (OuterVolumeSpecName: "kube-api-access-42gj8") pod "c43d705a-6aa5-43a0-839d-1ab705e28be6" (UID: "c43d705a-6aa5-43a0-839d-1ab705e28be6"). InnerVolumeSpecName "kube-api-access-42gj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.663215 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 18 14:16:35 crc kubenswrapper[4817]: E0218 14:16:35.663628 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c43d705a-6aa5-43a0-839d-1ab705e28be6" containerName="dnsmasq-dns" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.663642 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c43d705a-6aa5-43a0-839d-1ab705e28be6" containerName="dnsmasq-dns" Feb 18 14:16:35 crc kubenswrapper[4817]: E0218 14:16:35.663674 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c43d705a-6aa5-43a0-839d-1ab705e28be6" containerName="init" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.663681 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c43d705a-6aa5-43a0-839d-1ab705e28be6" containerName="init" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.663914 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="c43d705a-6aa5-43a0-839d-1ab705e28be6" containerName="dnsmasq-dns" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.671860 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.683793 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2n8gh" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.685567 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.685857 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.686026 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.700936 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.744425 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42gj8\" (UniqueName: \"kubernetes.io/projected/c43d705a-6aa5-43a0-839d-1ab705e28be6-kube-api-access-42gj8\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.790091 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-config" (OuterVolumeSpecName: "config") pod "c43d705a-6aa5-43a0-839d-1ab705e28be6" (UID: "c43d705a-6aa5-43a0-839d-1ab705e28be6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.825578 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" event={"ID":"e596742a-2a5e-4a0c-9177-2b5a1ce00651","Type":"ContainerStarted","Data":"e29cab569fca955b5e5ffbd213ba5fe699d48dc7dc91f76bea530ecf1ea9adc6"} Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.826504 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c43d705a-6aa5-43a0-839d-1ab705e28be6" (UID: "c43d705a-6aa5-43a0-839d-1ab705e28be6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.826804 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.842793 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1","Type":"ContainerStarted","Data":"3a7b1cf4852e8739319f69c8a261a36817c331f73c78b92b691e2916854cabb5"} Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.845905 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.846094 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77de7364-0925-438c-89e2-6ff0d3cb0776-combined-ca-bundle\") pod \"swift-storage-0\" (UID: 
\"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.846131 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7f6bb2f6-dd37-4018-8cad-b9e1c48732ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f6bb2f6-dd37-4018-8cad-b9e1c48732ce\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.846259 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s4ch\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-kube-api-access-6s4ch\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.846304 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/77de7364-0925-438c-89e2-6ff0d3cb0776-lock\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.846324 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/77de7364-0925-438c-89e2-6ff0d3cb0776-cache\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.846444 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.846466 4817 reconciler_common.go:293] "Volume detached for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c43d705a-6aa5-43a0-839d-1ab705e28be6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.849073 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" event={"ID":"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6","Type":"ContainerStarted","Data":"96b3cb0210ede19a1b34dad9cc238406b102c33b4a6057e308d3d517efbbed6a"} Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.849129 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" event={"ID":"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6","Type":"ContainerStarted","Data":"6b624401038eb3d722209c601c6b666c4997cae6e3ecbaec414ec63c6a91c945"} Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.857114 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" podStartSLOduration=-9223372005.997679 podStartE2EDuration="30.85709779s" podCreationTimestamp="2026-02-18 14:16:05 +0000 UTC" firstStartedPulling="2026-02-18 14:16:07.380948493 +0000 UTC m=+1029.956484476" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:16:35.856376742 +0000 UTC m=+1058.431912725" watchObservedRunningTime="2026-02-18 14:16:35.85709779 +0000 UTC m=+1058.432633773" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.860819 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" event={"ID":"b7632ade-ab1b-45b8-9f25-9fb98abc4f1a","Type":"ContainerStarted","Data":"dee78ada4db7e863d700c3e9308b1569f2d844f67b18e34637d0eae634d24fce"} Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.862044 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-j7gxb" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.892773 4817 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" event={"ID":"c43d705a-6aa5-43a0-839d-1ab705e28be6","Type":"ContainerDied","Data":"f2aadc52b196d6aa00c6e8eb3526ecad5b6465654003e4e07159bf48070be824"} Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.892854 4817 scope.go:117] "RemoveContainer" containerID="2982da1050d613096062e7dc3c83a58f7673e71ec3b39f1cb513bce1ddb34824" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.893145 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lczjn" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.929592 4817 scope.go:117] "RemoveContainer" containerID="0e5784205d0c73ad98cf5439c747c096092e48a2f0b513854e238d775e9212d8" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.948446 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/77de7364-0925-438c-89e2-6ff0d3cb0776-lock\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.948490 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/77de7364-0925-438c-89e2-6ff0d3cb0776-cache\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.948558 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.948698 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77de7364-0925-438c-89e2-6ff0d3cb0776-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.948727 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7f6bb2f6-dd37-4018-8cad-b9e1c48732ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f6bb2f6-dd37-4018-8cad-b9e1c48732ce\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.949122 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s4ch\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-kube-api-access-6s4ch\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.949153 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/77de7364-0925-438c-89e2-6ff0d3cb0776-lock\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: E0218 14:16:35.949320 4817 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 14:16:35 crc kubenswrapper[4817]: E0218 14:16:35.949341 4817 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 14:16:35 crc kubenswrapper[4817]: E0218 14:16:35.949400 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift podName:77de7364-0925-438c-89e2-6ff0d3cb0776 nodeName:}" failed. 
No retries permitted until 2026-02-18 14:16:36.449374834 +0000 UTC m=+1059.024910817 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift") pod "swift-storage-0" (UID: "77de7364-0925-438c-89e2-6ff0d3cb0776") : configmap "swift-ring-files" not found Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.949436 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/77de7364-0925-438c-89e2-6ff0d3cb0776-cache\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.961295 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lczjn"] Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.962327 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77de7364-0925-438c-89e2-6ff0d3cb0776-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.962827 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.962872 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7f6bb2f6-dd37-4018-8cad-b9e1c48732ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f6bb2f6-dd37-4018-8cad-b9e1c48732ce\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d25be36f3a38fcc02a65c889fbfc5633945a3d1f3e7294c1dfdedde0d3726d1b/globalmount\"" pod="openstack/swift-storage-0" Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.978945 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lczjn"] Feb 18 14:16:35 crc kubenswrapper[4817]: I0218 14:16:35.981783 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s4ch\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-kube-api-access-6s4ch\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.013888 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" podStartSLOduration=-9223372005.840912 podStartE2EDuration="31.013863392s" podCreationTimestamp="2026-02-18 14:16:05 +0000 UTC" firstStartedPulling="2026-02-18 14:16:07.431412437 +0000 UTC m=+1030.006948420" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:16:35.994394763 +0000 UTC m=+1058.569930766" watchObservedRunningTime="2026-02-18 14:16:36.013863392 +0000 UTC m=+1058.589399375" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.038310 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7f6bb2f6-dd37-4018-8cad-b9e1c48732ce\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7f6bb2f6-dd37-4018-8cad-b9e1c48732ce\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.183455 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c43d705a-6aa5-43a0-839d-1ab705e28be6" path="/var/lib/kubelet/pods/c43d705a-6aa5-43a0-839d-1ab705e28be6/volumes" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.213545 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7sxtp"] Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.216486 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.222764 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.222879 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.223117 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.268488 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7sxtp"] Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.295171 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-7sxtp"] Feb 18 14:16:36 crc kubenswrapper[4817]: E0218 14:16:36.296011 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-4fc8k ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-7sxtp" 
podUID="d0dc9c71-c03e-4aff-a16b-2878ca7eda5f" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.323519 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.324287 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-c5btx"] Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.325806 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.338501 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-c5btx"] Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.360216 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-dispersionconf\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.360273 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-swiftconf\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.360316 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-etc-swift\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.360348 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-ring-data-devices\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.360376 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-combined-ca-bundle\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.360649 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-scripts\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.360779 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fc8k\" (UniqueName: \"kubernetes.io/projected/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-kube-api-access-4fc8k\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.463176 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-scripts\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.463906 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n29bh\" (UniqueName: \"kubernetes.io/projected/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-kube-api-access-n29bh\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464093 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fc8k\" (UniqueName: \"kubernetes.io/projected/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-kube-api-access-4fc8k\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464156 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-scripts\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464211 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-combined-ca-bundle\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464308 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-dispersionconf\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464343 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-swiftconf\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464402 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-etc-swift\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464445 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-ring-data-devices\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464478 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-combined-ca-bundle\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464523 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-swiftconf\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464583 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-ring-data-devices\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464639 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-etc-swift\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464728 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-dispersionconf\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.464780 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:36 crc kubenswrapper[4817]: E0218 14:16:36.464994 4817 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 14:16:36 crc kubenswrapper[4817]: E0218 14:16:36.465015 4817 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 14:16:36 crc kubenswrapper[4817]: E0218 14:16:36.465070 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift 
podName:77de7364-0925-438c-89e2-6ff0d3cb0776 nodeName:}" failed. No retries permitted until 2026-02-18 14:16:37.465052096 +0000 UTC m=+1060.040588079 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift") pod "swift-storage-0" (UID: "77de7364-0925-438c-89e2-6ff0d3cb0776") : configmap "swift-ring-files" not found Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.466248 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-scripts\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.466398 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-etc-swift\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.467234 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-ring-data-devices\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.471784 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-dispersionconf\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.472063 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-swiftconf\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.480793 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-combined-ca-bundle\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.490619 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fc8k\" (UniqueName: \"kubernetes.io/projected/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-kube-api-access-4fc8k\") pod \"swift-ring-rebalance-7sxtp\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.566557 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-swiftconf\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.566634 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-ring-data-devices\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.566695 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-etc-swift\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.567359 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-etc-swift\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.567480 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-ring-data-devices\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.566739 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-dispersionconf\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.567616 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n29bh\" (UniqueName: \"kubernetes.io/projected/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-kube-api-access-n29bh\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.567684 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-scripts\") pod 
\"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.567716 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-combined-ca-bundle\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.568395 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-scripts\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.571467 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-swiftconf\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.571529 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-dispersionconf\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.572347 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-combined-ca-bundle\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" 
Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.591144 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n29bh\" (UniqueName: \"kubernetes.io/projected/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-kube-api-access-n29bh\") pod \"swift-ring-rebalance-c5btx\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.650835 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.907575 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"def7b080-de6e-49f1-9437-44d6f40b48c4","Type":"ContainerStarted","Data":"c89cc07019f8c3da87501733220247d75e8f81921605054514a93c7eb576ac97"} Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.913738 4817 generic.go:334] "Generic (PLEG): container finished" podID="ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" containerID="96b3cb0210ede19a1b34dad9cc238406b102c33b4a6057e308d3d517efbbed6a" exitCode=0 Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.913818 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" event={"ID":"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6","Type":"ContainerDied","Data":"96b3cb0210ede19a1b34dad9cc238406b102c33b4a6057e308d3d517efbbed6a"} Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.924561 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"641785a9-2372-4857-8882-192bf7d7fe45","Type":"ContainerStarted","Data":"4fa82fbd401c78d6c5c6823bf50c11dbdc26a0efaf586a0920b9403a0aa32fb1"} Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.931245 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.931460 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bcxg" event={"ID":"ddb73215-bd2a-47eb-bbcf-b4708117244f","Type":"ContainerStarted","Data":"1f52bf86fefcc5b4ba0b8dd6b982f3ce979c0d46ecac8251102821684ebfe082"} Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.931841 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9bcxg" Feb 18 14:16:36 crc kubenswrapper[4817]: I0218 14:16:36.953480 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.018160 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9bcxg" podStartSLOduration=9.793797807 podStartE2EDuration="41.018112904s" podCreationTimestamp="2026-02-18 14:15:56 +0000 UTC" firstStartedPulling="2026-02-18 14:16:04.469739515 +0000 UTC m=+1027.045275508" lastFinishedPulling="2026-02-18 14:16:35.694054622 +0000 UTC m=+1058.269590605" observedRunningTime="2026-02-18 14:16:37.014744719 +0000 UTC m=+1059.590280722" watchObservedRunningTime="2026-02-18 14:16:37.018112904 +0000 UTC m=+1059.593648907" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.077101 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-swiftconf\") pod \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.077215 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-etc-swift\") pod \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\" (UID: 
\"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.077258 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-combined-ca-bundle\") pod \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.077312 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fc8k\" (UniqueName: \"kubernetes.io/projected/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-kube-api-access-4fc8k\") pod \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.077381 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-ring-data-devices\") pod \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.077426 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-scripts\") pod \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.077473 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-dispersionconf\") pod \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\" (UID: \"d0dc9c71-c03e-4aff-a16b-2878ca7eda5f\") " Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.079306 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f" (UID: "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.079379 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f" (UID: "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.080418 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-scripts" (OuterVolumeSpecName: "scripts") pod "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f" (UID: "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.083628 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f" (UID: "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.084452 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-kube-api-access-4fc8k" (OuterVolumeSpecName: "kube-api-access-4fc8k") pod "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f" (UID: "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f"). 
InnerVolumeSpecName "kube-api-access-4fc8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.084896 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f" (UID: "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.084937 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f" (UID: "d0dc9c71-c03e-4aff-a16b-2878ca7eda5f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.181688 4817 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.182154 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.182171 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fc8k\" (UniqueName: \"kubernetes.io/projected/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-kube-api-access-4fc8k\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.182221 4817 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.182235 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.182246 4817 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.182257 4817 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.205270 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-c5btx"] Feb 18 14:16:37 crc kubenswrapper[4817]: W0218 14:16:37.214370 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8fdaa1_d441_4b5e_b376_8ab67ce68339.slice/crio-7d1ffdbcea6dfca08e3f8629b7f3ae45ad0d1f26ecb99f86cd3bffc954493a34 WatchSource:0}: Error finding container 7d1ffdbcea6dfca08e3f8629b7f3ae45ad0d1f26ecb99f86cd3bffc954493a34: Status 404 returned error can't find the container with id 7d1ffdbcea6dfca08e3f8629b7f3ae45ad0d1f26ecb99f86cd3bffc954493a34 Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.487900 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:37 crc kubenswrapper[4817]: E0218 
14:16:37.488256 4817 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 14:16:37 crc kubenswrapper[4817]: E0218 14:16:37.488285 4817 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 14:16:37 crc kubenswrapper[4817]: E0218 14:16:37.488346 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift podName:77de7364-0925-438c-89e2-6ff0d3cb0776 nodeName:}" failed. No retries permitted until 2026-02-18 14:16:39.488323995 +0000 UTC m=+1062.063859978 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift") pod "swift-storage-0" (UID: "77de7364-0925-438c-89e2-6ff0d3cb0776") : configmap "swift-ring-files" not found Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.960875 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" event={"ID":"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6","Type":"ContainerStarted","Data":"4546aa2708adffdee41eff6079e6fe17c8a2b863def06c5f477915fabba8319d"} Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.962298 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.970910 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7sxtp" Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.971756 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c5btx" event={"ID":"7f8fdaa1-d441-4b5e-b376-8ab67ce68339","Type":"ContainerStarted","Data":"7d1ffdbcea6dfca08e3f8629b7f3ae45ad0d1f26ecb99f86cd3bffc954493a34"} Feb 18 14:16:37 crc kubenswrapper[4817]: I0218 14:16:37.991857 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" podStartSLOduration=3.991837951 podStartE2EDuration="3.991837951s" podCreationTimestamp="2026-02-18 14:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:16:37.986376004 +0000 UTC m=+1060.561911997" watchObservedRunningTime="2026-02-18 14:16:37.991837951 +0000 UTC m=+1060.567373934" Feb 18 14:16:38 crc kubenswrapper[4817]: I0218 14:16:38.044350 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-7sxtp"] Feb 18 14:16:38 crc kubenswrapper[4817]: I0218 14:16:38.060658 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-7sxtp"] Feb 18 14:16:38 crc kubenswrapper[4817]: I0218 14:16:38.186818 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0dc9c71-c03e-4aff-a16b-2878ca7eda5f" path="/var/lib/kubelet/pods/d0dc9c71-c03e-4aff-a16b-2878ca7eda5f/volumes" Feb 18 14:16:39 crc kubenswrapper[4817]: I0218 14:16:39.545327 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:39 crc kubenswrapper[4817]: E0218 14:16:39.545509 4817 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 14:16:39 crc kubenswrapper[4817]: E0218 14:16:39.545529 4817 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 14:16:39 crc kubenswrapper[4817]: E0218 14:16:39.545573 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift podName:77de7364-0925-438c-89e2-6ff0d3cb0776 nodeName:}" failed. No retries permitted until 2026-02-18 14:16:43.545559031 +0000 UTC m=+1066.121095014 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift") pod "swift-storage-0" (UID: "77de7364-0925-438c-89e2-6ff0d3cb0776") : configmap "swift-ring-files" not found Feb 18 14:16:40 crc kubenswrapper[4817]: I0218 14:16:40.997841 4817 generic.go:334] "Generic (PLEG): container finished" podID="641785a9-2372-4857-8882-192bf7d7fe45" containerID="4fa82fbd401c78d6c5c6823bf50c11dbdc26a0efaf586a0920b9403a0aa32fb1" exitCode=0 Feb 18 14:16:40 crc kubenswrapper[4817]: I0218 14:16:40.997993 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"641785a9-2372-4857-8882-192bf7d7fe45","Type":"ContainerDied","Data":"4fa82fbd401c78d6c5c6823bf50c11dbdc26a0efaf586a0920b9403a0aa32fb1"} Feb 18 14:16:41 crc kubenswrapper[4817]: I0218 14:16:41.002948 4817 generic.go:334] "Generic (PLEG): container finished" podID="def7b080-de6e-49f1-9437-44d6f40b48c4" containerID="c89cc07019f8c3da87501733220247d75e8f81921605054514a93c7eb576ac97" exitCode=0 Feb 18 14:16:41 crc kubenswrapper[4817]: I0218 14:16:41.003022 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"def7b080-de6e-49f1-9437-44d6f40b48c4","Type":"ContainerDied","Data":"c89cc07019f8c3da87501733220247d75e8f81921605054514a93c7eb576ac97"} Feb 18 14:16:42 crc kubenswrapper[4817]: I0218 14:16:42.863731 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:16:42 crc kubenswrapper[4817]: I0218 14:16:42.864545 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:16:42 crc kubenswrapper[4817]: I0218 14:16:42.864612 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 14:16:42 crc kubenswrapper[4817]: I0218 14:16:42.865846 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a45288dd8059ad4005579ccd7ba9584a44ec34777e8d02ff7b0f8c874cff3f7"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 14:16:42 crc kubenswrapper[4817]: I0218 14:16:42.865920 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://2a45288dd8059ad4005579ccd7ba9584a44ec34777e8d02ff7b0f8c874cff3f7" gracePeriod=600 Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.232013 4817 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-gk7p2"] Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.233837 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.237312 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.265049 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gk7p2"] Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.431778 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be226950-1270-454d-8b23-2260dba4c819-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.433055 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be226950-1270-454d-8b23-2260dba4c819-config\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.433261 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/be226950-1270-454d-8b23-2260dba4c819-ovn-rundir\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.433429 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/be226950-1270-454d-8b23-2260dba4c819-ovs-rundir\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.433496 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be226950-1270-454d-8b23-2260dba4c819-combined-ca-bundle\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.433530 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2vpj\" (UniqueName: \"kubernetes.io/projected/be226950-1270-454d-8b23-2260dba4c819-kube-api-access-b2vpj\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.446456 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-95m6m"] Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.446673 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" podUID="ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" containerName="dnsmasq-dns" containerID="cri-o://4546aa2708adffdee41eff6079e6fe17c8a2b863def06c5f477915fabba8319d" gracePeriod=10 Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.448154 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.479339 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-wwz4n"] Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 
14:16:43.481001 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.484083 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.511967 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-wwz4n"] Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535421 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/be226950-1270-454d-8b23-2260dba4c819-ovs-rundir\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535484 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be226950-1270-454d-8b23-2260dba4c819-combined-ca-bundle\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535514 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2vpj\" (UniqueName: \"kubernetes.io/projected/be226950-1270-454d-8b23-2260dba4c819-kube-api-access-b2vpj\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535554 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be226950-1270-454d-8b23-2260dba4c819-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " 
pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535675 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmr58\" (UniqueName: \"kubernetes.io/projected/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-kube-api-access-tmr58\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535704 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be226950-1270-454d-8b23-2260dba4c819-config\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535723 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535750 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-config\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535816 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/be226950-1270-454d-8b23-2260dba4c819-ovn-rundir\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " 
pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535851 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/be226950-1270-454d-8b23-2260dba4c819-ovs-rundir\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.535863 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.536134 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/be226950-1270-454d-8b23-2260dba4c819-ovn-rundir\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.536652 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be226950-1270-454d-8b23-2260dba4c819-config\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.545235 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be226950-1270-454d-8b23-2260dba4c819-combined-ca-bundle\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 
14:16:43.547685 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be226950-1270-454d-8b23-2260dba4c819-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.566918 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vpj\" (UniqueName: \"kubernetes.io/projected/be226950-1270-454d-8b23-2260dba4c819-kube-api-access-b2vpj\") pod \"ovn-controller-metrics-gk7p2\" (UID: \"be226950-1270-454d-8b23-2260dba4c819\") " pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.637799 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.638022 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmr58\" (UniqueName: \"kubernetes.io/projected/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-kube-api-access-tmr58\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.638075 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.638110 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-config\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.638172 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:43 crc kubenswrapper[4817]: E0218 14:16:43.638357 4817 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 14:16:43 crc kubenswrapper[4817]: E0218 14:16:43.638394 4817 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 14:16:43 crc kubenswrapper[4817]: E0218 14:16:43.638444 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift podName:77de7364-0925-438c-89e2-6ff0d3cb0776 nodeName:}" failed. No retries permitted until 2026-02-18 14:16:51.638425991 +0000 UTC m=+1074.213961974 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift") pod "swift-storage-0" (UID: "77de7364-0925-438c-89e2-6ff0d3cb0776") : configmap "swift-ring-files" not found Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.638737 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.638906 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.639403 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-config\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.679787 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmr58\" (UniqueName: \"kubernetes.io/projected/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-kube-api-access-tmr58\") pod \"dnsmasq-dns-74f6f696b9-wwz4n\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.737962 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-wwz4n"] Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.738616 4817 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.802773 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-4jpr4"] Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.809453 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.812159 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.840929 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.841008 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.841063 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-dns-svc\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.841101 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-config\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.841197 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6xx\" (UniqueName: \"kubernetes.io/projected/6e365fd4-7c85-448e-b932-e12471d948d5-kube-api-access-ct6xx\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.864567 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gk7p2" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.930237 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4jpr4"] Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.945198 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.945279 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.945347 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-dns-svc\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.945396 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-config\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.945506 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6xx\" (UniqueName: \"kubernetes.io/projected/6e365fd4-7c85-448e-b932-e12471d948d5-kube-api-access-ct6xx\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.946454 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.946916 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.947157 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-dns-svc\") pod 
\"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.947664 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-config\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:43 crc kubenswrapper[4817]: I0218 14:16:43.972192 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6xx\" (UniqueName: \"kubernetes.io/projected/6e365fd4-7c85-448e-b932-e12471d948d5-kube-api-access-ct6xx\") pod \"dnsmasq-dns-698758b865-4jpr4\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.038748 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="2a45288dd8059ad4005579ccd7ba9584a44ec34777e8d02ff7b0f8c874cff3f7" exitCode=0 Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.039305 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"2a45288dd8059ad4005579ccd7ba9584a44ec34777e8d02ff7b0f8c874cff3f7"} Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.039344 4817 scope.go:117] "RemoveContainer" containerID="45f4df11b9cafd0abed8804744792dcd58abd224e061fc9294ed85d9ec653f5f" Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.047138 4817 generic.go:334] "Generic (PLEG): container finished" podID="ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" containerID="4546aa2708adffdee41eff6079e6fe17c8a2b863def06c5f477915fabba8319d" exitCode=0 Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.047230 4817 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" event={"ID":"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6","Type":"ContainerDied","Data":"4546aa2708adffdee41eff6079e6fe17c8a2b863def06c5f477915fabba8319d"} Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.252259 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.472733 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.559248 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4qwr\" (UniqueName: \"kubernetes.io/projected/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-kube-api-access-n4qwr\") pod \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.559351 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-config\") pod \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.559488 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-dns-svc\") pod \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\" (UID: \"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6\") " Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.565417 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-kube-api-access-n4qwr" (OuterVolumeSpecName: "kube-api-access-n4qwr") pod "ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" (UID: 
"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6"). InnerVolumeSpecName "kube-api-access-n4qwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.615751 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" (UID: "ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.641665 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-config" (OuterVolumeSpecName: "config") pod "ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" (UID: "ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.668408 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.668757 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:44 crc kubenswrapper[4817]: I0218 14:16:44.668770 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4qwr\" (UniqueName: \"kubernetes.io/projected/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6-kube-api-access-n4qwr\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:45 crc kubenswrapper[4817]: I0218 14:16:45.026278 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-wwz4n"] Feb 18 14:16:45 crc kubenswrapper[4817]: W0218 14:16:45.060003 
4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90cff1d5_8ddd_42e1_b0a8_a7b5130eba8d.slice/crio-b7a9e6dbbf9eff80c1a43a32769f2d6ce060700c302f433e38924f74bc7a8c3d WatchSource:0}: Error finding container b7a9e6dbbf9eff80c1a43a32769f2d6ce060700c302f433e38924f74bc7a8c3d: Status 404 returned error can't find the container with id b7a9e6dbbf9eff80c1a43a32769f2d6ce060700c302f433e38924f74bc7a8c3d Feb 18 14:16:45 crc kubenswrapper[4817]: I0218 14:16:45.069685 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" Feb 18 14:16:45 crc kubenswrapper[4817]: I0218 14:16:45.069688 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-95m6m" event={"ID":"ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6","Type":"ContainerDied","Data":"6b624401038eb3d722209c601c6b666c4997cae6e3ecbaec414ec63c6a91c945"} Feb 18 14:16:45 crc kubenswrapper[4817]: I0218 14:16:45.069749 4817 scope.go:117] "RemoveContainer" containerID="4546aa2708adffdee41eff6079e6fe17c8a2b863def06c5f477915fabba8319d" Feb 18 14:16:45 crc kubenswrapper[4817]: I0218 14:16:45.074010 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"bd719d9fe372437c635a5966e962ebc51e7647a95b5fd6491500726f444d522f"} Feb 18 14:16:45 crc kubenswrapper[4817]: I0218 14:16:45.115120 4817 scope.go:117] "RemoveContainer" containerID="96b3cb0210ede19a1b34dad9cc238406b102c33b4a6057e308d3d517efbbed6a" Feb 18 14:16:45 crc kubenswrapper[4817]: I0218 14:16:45.131129 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-95m6m"] Feb 18 14:16:45 crc kubenswrapper[4817]: I0218 14:16:45.142518 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7cb5889db5-95m6m"] Feb 18 14:16:45 crc kubenswrapper[4817]: I0218 14:16:45.159272 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gk7p2"] Feb 18 14:16:45 crc kubenswrapper[4817]: I0218 14:16:45.170330 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4jpr4"] Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.084495 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c5btx" event={"ID":"7f8fdaa1-d441-4b5e-b376-8ab67ce68339","Type":"ContainerStarted","Data":"cf9bda08af9894c9a4a938124159cb1c0d0c15f639c2f973d701ad008225ed32"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.086856 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gk7p2" event={"ID":"be226950-1270-454d-8b23-2260dba4c819","Type":"ContainerStarted","Data":"ab95a2cc81b1e84b43eb2a3c78c386a8b391d3bd94e0fb5ccb99898e70774179"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.086900 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gk7p2" event={"ID":"be226950-1270-454d-8b23-2260dba4c819","Type":"ContainerStarted","Data":"7c61b2eb3f0d1d2a93ddea8a75cc81ccdd41553b6e524470fffb39a65b1ba3dd"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.089253 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9e5146f3-4a88-4e31-82e7-0e0f72188d22","Type":"ContainerStarted","Data":"73122d53dd5b662dbcc7c3eef0dae215731e6059f1aeecd9b00a645d62ace5d1"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.091004 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b89d17a9-16cb-4abe-ba88-107ce95dbceb","Type":"ContainerStarted","Data":"f70a7d86ab6c6f4ef8d88cfdccfc6c56bbf2a7ba338dcd277ef6cf8e638f816c"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 
14:16:46.094933 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"def7b080-de6e-49f1-9437-44d6f40b48c4","Type":"ContainerStarted","Data":"ed242fd5746c7c1e5a836264590aed3c566600e80ddb2fe7b3452a64e30df91b"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.097091 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d5d448f4-839e-4b71-ac6e-0c941ccd5a14","Type":"ContainerStarted","Data":"1e8977c90e480a38324dbe7a779273e2890fcc959446890b828d6f54e033a6a2"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.098406 4817 generic.go:334] "Generic (PLEG): container finished" podID="90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d" containerID="b0fd42c73b753060d5d2a5ceddf2eb9bf89131d2c47adbc9d0272489bd013c28" exitCode=0 Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.098465 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" event={"ID":"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d","Type":"ContainerDied","Data":"b0fd42c73b753060d5d2a5ceddf2eb9bf89131d2c47adbc9d0272489bd013c28"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.098486 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" event={"ID":"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d","Type":"ContainerStarted","Data":"b7a9e6dbbf9eff80c1a43a32769f2d6ce060700c302f433e38924f74bc7a8c3d"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.101254 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"641785a9-2372-4857-8882-192bf7d7fe45","Type":"ContainerStarted","Data":"7594215e43ee5d9eaa7e87cdf963da65f0dafab8ff12b67a1965d6c1996a896c"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.102955 4817 generic.go:334] "Generic (PLEG): container finished" podID="6e365fd4-7c85-448e-b932-e12471d948d5" 
containerID="7852bca6be599402e9ef8a2226e86dbd411f7a7941dbf4d06f006e6602f294f0" exitCode=0 Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.105508 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4jpr4" event={"ID":"6e365fd4-7c85-448e-b932-e12471d948d5","Type":"ContainerDied","Data":"7852bca6be599402e9ef8a2226e86dbd411f7a7941dbf4d06f006e6602f294f0"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.105613 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4jpr4" event={"ID":"6e365fd4-7c85-448e-b932-e12471d948d5","Type":"ContainerStarted","Data":"0ff72cef63e0874ac39d44fdc5e450b328f51a202214e4b7aff780b37822df94"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.110309 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-c5btx" podStartSLOduration=2.813651058 podStartE2EDuration="10.110288864s" podCreationTimestamp="2026-02-18 14:16:36 +0000 UTC" firstStartedPulling="2026-02-18 14:16:37.223676619 +0000 UTC m=+1059.799212602" lastFinishedPulling="2026-02-18 14:16:44.520314425 +0000 UTC m=+1067.095850408" observedRunningTime="2026-02-18 14:16:46.104092329 +0000 UTC m=+1068.679628312" watchObservedRunningTime="2026-02-18 14:16:46.110288864 +0000 UTC m=+1068.685824847" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.130674 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.834294039 podStartE2EDuration="50.130655495s" podCreationTimestamp="2026-02-18 14:15:56 +0000 UTC" firstStartedPulling="2026-02-18 14:16:04.556745275 +0000 UTC m=+1027.132281258" lastFinishedPulling="2026-02-18 14:16:44.853106731 +0000 UTC m=+1067.428642714" observedRunningTime="2026-02-18 14:16:46.128318516 +0000 UTC m=+1068.703854499" watchObservedRunningTime="2026-02-18 14:16:46.130655495 +0000 UTC m=+1068.706191478" Feb 18 14:16:46 crc kubenswrapper[4817]: 
I0218 14:16:46.133207 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"317526d8-4a73-4ae4-9607-b1d7375ba7f6","Type":"ContainerStarted","Data":"6b77ee655f3528ac4147eba0a513d477b2df9684006510be9240611cb7103cba"} Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.170613 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.397945316 podStartE2EDuration="56.170597877s" podCreationTimestamp="2026-02-18 14:15:50 +0000 UTC" firstStartedPulling="2026-02-18 14:16:04.462184666 +0000 UTC m=+1027.037720659" lastFinishedPulling="2026-02-18 14:16:35.234837237 +0000 UTC m=+1057.810373220" observedRunningTime="2026-02-18 14:16:46.169750425 +0000 UTC m=+1068.745286428" watchObservedRunningTime="2026-02-18 14:16:46.170597877 +0000 UTC m=+1068.746133860" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.199165 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" path="/var/lib/kubelet/pods/ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6/volumes" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.199866 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.206993 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-gk7p2" podStartSLOduration=3.206959888 podStartE2EDuration="3.206959888s" podCreationTimestamp="2026-02-18 14:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:16:46.192182158 +0000 UTC m=+1068.767718151" watchObservedRunningTime="2026-02-18 14:16:46.206959888 +0000 UTC m=+1068.782495871" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.263737 4817 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.514789959 podStartE2EDuration="58.263712392s" podCreationTimestamp="2026-02-18 14:15:48 +0000 UTC" firstStartedPulling="2026-02-18 14:16:04.040224324 +0000 UTC m=+1026.615760317" lastFinishedPulling="2026-02-18 14:16:35.789146767 +0000 UTC m=+1058.364682750" observedRunningTime="2026-02-18 14:16:46.261716501 +0000 UTC m=+1068.837252484" watchObservedRunningTime="2026-02-18 14:16:46.263712392 +0000 UTC m=+1068.839248385" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.319503 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.131150616 podStartE2EDuration="46.319427009s" podCreationTimestamp="2026-02-18 14:16:00 +0000 UTC" firstStartedPulling="2026-02-18 14:16:05.668723715 +0000 UTC m=+1028.244259708" lastFinishedPulling="2026-02-18 14:16:44.857000118 +0000 UTC m=+1067.432536101" observedRunningTime="2026-02-18 14:16:46.311266754 +0000 UTC m=+1068.886802757" watchObservedRunningTime="2026-02-18 14:16:46.319427009 +0000 UTC m=+1068.894963012" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.365254 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.393595 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="7f685dd5-8921-4e4a-a4d5-d19a499775f5" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.492261 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.501202 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.603745 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmr58\" (UniqueName: \"kubernetes.io/projected/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-kube-api-access-tmr58\") pod \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.603828 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-dns-svc\") pod \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.603895 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-config\") pod \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.604013 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-ovsdbserver-nb\") pod \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\" (UID: \"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d\") " Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.611304 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-kube-api-access-tmr58" (OuterVolumeSpecName: "kube-api-access-tmr58") pod "90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d" (UID: "90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d"). InnerVolumeSpecName "kube-api-access-tmr58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.632553 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d" (UID: "90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.633886 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d" (UID: "90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.705941 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-fmj4p" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.707249 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmr58\" (UniqueName: \"kubernetes.io/projected/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-kube-api-access-tmr58\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.707299 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.707315 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.795313 4817 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.853208 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-config" (OuterVolumeSpecName: "config") pod "90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d" (UID: "90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:46 crc kubenswrapper[4817]: I0218 14:16:46.913546 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:47 crc kubenswrapper[4817]: I0218 14:16:47.143149 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4jpr4" event={"ID":"6e365fd4-7c85-448e-b932-e12471d948d5","Type":"ContainerStarted","Data":"3e253b2e3d34a6940187b42d285539b0c2670e0a2e80d4122ef03184dfff4251"} Feb 18 14:16:47 crc kubenswrapper[4817]: I0218 14:16:47.144235 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:47 crc kubenswrapper[4817]: I0218 14:16:47.146467 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" Feb 18 14:16:47 crc kubenswrapper[4817]: I0218 14:16:47.150323 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-wwz4n" event={"ID":"90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d","Type":"ContainerDied","Data":"b7a9e6dbbf9eff80c1a43a32769f2d6ce060700c302f433e38924f74bc7a8c3d"} Feb 18 14:16:47 crc kubenswrapper[4817]: I0218 14:16:47.150410 4817 scope.go:117] "RemoveContainer" containerID="b0fd42c73b753060d5d2a5ceddf2eb9bf89131d2c47adbc9d0272489bd013c28" Feb 18 14:16:47 crc kubenswrapper[4817]: I0218 14:16:47.175843 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-4jpr4" podStartSLOduration=4.175826254 podStartE2EDuration="4.175826254s" podCreationTimestamp="2026-02-18 14:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:16:47.169845174 +0000 UTC m=+1069.745381157" watchObservedRunningTime="2026-02-18 14:16:47.175826254 +0000 UTC m=+1069.751362237" Feb 18 14:16:47 crc kubenswrapper[4817]: I0218 14:16:47.232576 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:47 crc kubenswrapper[4817]: I0218 14:16:47.232936 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:48 crc kubenswrapper[4817]: I0218 14:16:48.199185 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 18 14:16:48 crc kubenswrapper[4817]: I0218 14:16:48.246649 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-wwz4n"] Feb 18 14:16:48 crc kubenswrapper[4817]: I0218 14:16:48.256109 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-wwz4n"] Feb 18 14:16:49 crc 
kubenswrapper[4817]: I0218 14:16:49.170219 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d5d448f4-839e-4b71-ac6e-0c941ccd5a14","Type":"ContainerStarted","Data":"a115b4ffa1e08b7aa72751645a637afa2ca40952df9066d390305a579effd26c"} Feb 18 14:16:49 crc kubenswrapper[4817]: I0218 14:16:49.173696 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9e5146f3-4a88-4e31-82e7-0e0f72188d22","Type":"ContainerStarted","Data":"fe2ae8fcf2beaab080ca2a656a530bb8e7317660e71caceb3a5b2d8b10845f3f"} Feb 18 14:16:49 crc kubenswrapper[4817]: I0218 14:16:49.206646 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=15.232911254 podStartE2EDuration="55.206625407s" podCreationTimestamp="2026-02-18 14:15:54 +0000 UTC" firstStartedPulling="2026-02-18 14:16:04.463319764 +0000 UTC m=+1027.038855757" lastFinishedPulling="2026-02-18 14:16:44.437033927 +0000 UTC m=+1067.012569910" observedRunningTime="2026-02-18 14:16:49.193235551 +0000 UTC m=+1071.768771554" watchObservedRunningTime="2026-02-18 14:16:49.206625407 +0000 UTC m=+1071.782161390" Feb 18 14:16:49 crc kubenswrapper[4817]: I0218 14:16:49.239588 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 18 14:16:50 crc kubenswrapper[4817]: I0218 14:16:50.137951 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 18 14:16:50 crc kubenswrapper[4817]: I0218 14:16:50.138058 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 18 14:16:50 crc kubenswrapper[4817]: I0218 14:16:50.139600 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 18 14:16:50 crc kubenswrapper[4817]: I0218 14:16:50.145100 4817 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 18 14:16:50 crc kubenswrapper[4817]: I0218 14:16:50.188638 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d" path="/var/lib/kubelet/pods/90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d/volumes" Feb 18 14:16:50 crc kubenswrapper[4817]: I0218 14:16:50.275321 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:50 crc kubenswrapper[4817]: I0218 14:16:50.323434 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 18 14:16:51 crc kubenswrapper[4817]: I0218 14:16:51.463856 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 18 14:16:51 crc kubenswrapper[4817]: I0218 14:16:51.463902 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 18 14:16:51 crc kubenswrapper[4817]: I0218 14:16:51.707528 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0" Feb 18 14:16:51 crc kubenswrapper[4817]: E0218 14:16:51.707719 4817 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 18 14:16:51 crc kubenswrapper[4817]: E0218 14:16:51.707740 4817 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 18 14:16:51 crc kubenswrapper[4817]: E0218 14:16:51.707797 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift 
podName:77de7364-0925-438c-89e2-6ff0d3cb0776 nodeName:}" failed. No retries permitted until 2026-02-18 14:17:07.707775814 +0000 UTC m=+1090.283311797 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift") pod "swift-storage-0" (UID: "77de7364-0925-438c-89e2-6ff0d3cb0776") : configmap "swift-ring-files" not found Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.251251 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.582917 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 18 14:16:53 crc kubenswrapper[4817]: E0218 14:16:53.588537 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" containerName="init" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.588559 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" containerName="init" Feb 18 14:16:53 crc kubenswrapper[4817]: E0218 14:16:53.588569 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" containerName="dnsmasq-dns" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.588576 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" containerName="dnsmasq-dns" Feb 18 14:16:53 crc kubenswrapper[4817]: E0218 14:16:53.588600 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d" containerName="init" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.588606 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d" containerName="init" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.588829 4817 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="90cff1d5-8ddd-42e1-b0a8-a7b5130eba8d" containerName="init" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.588857 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae93eccc-9e7c-4f5c-b427-24e8a3c6fba6" containerName="dnsmasq-dns" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.590062 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.591901 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-jrz7h" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.592184 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.592770 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.615087 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.626750 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.652155 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvg9c\" (UniqueName: \"kubernetes.io/projected/8b26cbdf-a148-4d07-bffc-afa241bc30e2-kube-api-access-nvg9c\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.652255 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b26cbdf-a148-4d07-bffc-afa241bc30e2-config\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " 
pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.652285 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b26cbdf-a148-4d07-bffc-afa241bc30e2-scripts\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.652308 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b26cbdf-a148-4d07-bffc-afa241bc30e2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.652434 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b26cbdf-a148-4d07-bffc-afa241bc30e2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.652469 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b26cbdf-a148-4d07-bffc-afa241bc30e2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.652487 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b26cbdf-a148-4d07-bffc-afa241bc30e2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.753680 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvg9c\" (UniqueName: \"kubernetes.io/projected/8b26cbdf-a148-4d07-bffc-afa241bc30e2-kube-api-access-nvg9c\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.753777 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b26cbdf-a148-4d07-bffc-afa241bc30e2-config\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.753807 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b26cbdf-a148-4d07-bffc-afa241bc30e2-scripts\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.753829 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b26cbdf-a148-4d07-bffc-afa241bc30e2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.753928 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b26cbdf-a148-4d07-bffc-afa241bc30e2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.753965 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b26cbdf-a148-4d07-bffc-afa241bc30e2-ovn-rundir\") pod \"ovn-northd-0\" 
(UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.754009 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b26cbdf-a148-4d07-bffc-afa241bc30e2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.754613 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b26cbdf-a148-4d07-bffc-afa241bc30e2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.754882 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b26cbdf-a148-4d07-bffc-afa241bc30e2-config\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.755601 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b26cbdf-a148-4d07-bffc-afa241bc30e2-scripts\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.767245 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b26cbdf-a148-4d07-bffc-afa241bc30e2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.770729 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b26cbdf-a148-4d07-bffc-afa241bc30e2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.797611 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b26cbdf-a148-4d07-bffc-afa241bc30e2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.805531 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvg9c\" (UniqueName: \"kubernetes.io/projected/8b26cbdf-a148-4d07-bffc-afa241bc30e2-kube-api-access-nvg9c\") pod \"ovn-northd-0\" (UID: \"8b26cbdf-a148-4d07-bffc-afa241bc30e2\") " pod="openstack/ovn-northd-0" Feb 18 14:16:53 crc kubenswrapper[4817]: I0218 14:16:53.925745 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 18 14:16:54 crc kubenswrapper[4817]: I0218 14:16:54.255411 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:16:54 crc kubenswrapper[4817]: I0218 14:16:54.323277 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lfrgc"] Feb 18 14:16:54 crc kubenswrapper[4817]: I0218 14:16:54.323539 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" podUID="5ffa1255-1d7c-43a4-a197-931d34164a31" containerName="dnsmasq-dns" containerID="cri-o://57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac" gracePeriod=10 Feb 18 14:16:54 crc kubenswrapper[4817]: I0218 14:16:54.504148 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 18 14:16:54 crc kubenswrapper[4817]: W0218 14:16:54.509171 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b26cbdf_a148_4d07_bffc_afa241bc30e2.slice/crio-2a8bc9fade88af0ede15bc32cc0b1f5f21f200bb4a38ea7824f141d614f6197f WatchSource:0}: Error finding container 2a8bc9fade88af0ede15bc32cc0b1f5f21f200bb4a38ea7824f141d614f6197f: Status 404 returned error can't find the container with id 2a8bc9fade88af0ede15bc32cc0b1f5f21f200bb4a38ea7824f141d614f6197f Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.205356 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.239400 4817 generic.go:334] "Generic (PLEG): container finished" podID="5ffa1255-1d7c-43a4-a197-931d34164a31" containerID="57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac" exitCode=0 Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.239516 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" event={"ID":"5ffa1255-1d7c-43a4-a197-931d34164a31","Type":"ContainerDied","Data":"57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac"} Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.239548 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" event={"ID":"5ffa1255-1d7c-43a4-a197-931d34164a31","Type":"ContainerDied","Data":"b230e337a15cebeaabb082fc171738df6fe4c6b66870a032585c55f85039c269"} Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.239570 4817 scope.go:117] "RemoveContainer" containerID="57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.239720 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lfrgc" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.242042 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b26cbdf-a148-4d07-bffc-afa241bc30e2","Type":"ContainerStarted","Data":"2a8bc9fade88af0ede15bc32cc0b1f5f21f200bb4a38ea7824f141d614f6197f"} Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.276175 4817 scope.go:117] "RemoveContainer" containerID="c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.282429 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4q62\" (UniqueName: \"kubernetes.io/projected/5ffa1255-1d7c-43a4-a197-931d34164a31-kube-api-access-g4q62\") pod \"5ffa1255-1d7c-43a4-a197-931d34164a31\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.282559 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-dns-svc\") pod \"5ffa1255-1d7c-43a4-a197-931d34164a31\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.282581 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-config\") pod \"5ffa1255-1d7c-43a4-a197-931d34164a31\" (UID: \"5ffa1255-1d7c-43a4-a197-931d34164a31\") " Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.305517 4817 scope.go:117] "RemoveContainer" containerID="57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac" Feb 18 14:16:55 crc kubenswrapper[4817]: E0218 14:16:55.306603 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac\": container with ID starting with 57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac not found: ID does not exist" containerID="57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.306723 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac"} err="failed to get container status \"57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac\": rpc error: code = NotFound desc = could not find container \"57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac\": container with ID starting with 57f72dcd454daf960339e19ab0c7a980f7d3db07916e4786141c767225a3b2ac not found: ID does not exist" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.306807 4817 scope.go:117] "RemoveContainer" containerID="c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b" Feb 18 14:16:55 crc kubenswrapper[4817]: E0218 14:16:55.307392 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b\": container with ID starting with c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b not found: ID does not exist" containerID="c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.307490 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b"} err="failed to get container status \"c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b\": rpc error: code = NotFound desc = could not find container \"c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b\": container with ID 
starting with c83540ab8f3e1e8a5f97b3084ce4f8bcdd393590521f6335665d87036b68375b not found: ID does not exist" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.317170 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffa1255-1d7c-43a4-a197-931d34164a31-kube-api-access-g4q62" (OuterVolumeSpecName: "kube-api-access-g4q62") pod "5ffa1255-1d7c-43a4-a197-931d34164a31" (UID: "5ffa1255-1d7c-43a4-a197-931d34164a31"). InnerVolumeSpecName "kube-api-access-g4q62". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.350391 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ffa1255-1d7c-43a4-a197-931d34164a31" (UID: "5ffa1255-1d7c-43a4-a197-931d34164a31"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.354018 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-config" (OuterVolumeSpecName: "config") pod "5ffa1255-1d7c-43a4-a197-931d34164a31" (UID: "5ffa1255-1d7c-43a4-a197-931d34164a31"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.385395 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.385644 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffa1255-1d7c-43a4-a197-931d34164a31-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.385725 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4q62\" (UniqueName: \"kubernetes.io/projected/5ffa1255-1d7c-43a4-a197-931d34164a31-kube-api-access-g4q62\") on node \"crc\" DevicePath \"\"" Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.579854 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lfrgc"] Feb 18 14:16:55 crc kubenswrapper[4817]: I0218 14:16:55.587328 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lfrgc"] Feb 18 14:16:56 crc kubenswrapper[4817]: I0218 14:16:56.185527 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffa1255-1d7c-43a4-a197-931d34164a31" path="/var/lib/kubelet/pods/5ffa1255-1d7c-43a4-a197-931d34164a31/volumes" Feb 18 14:16:56 crc kubenswrapper[4817]: I0218 14:16:56.329039 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-qb4ph" Feb 18 14:16:56 crc kubenswrapper[4817]: I0218 14:16:56.390084 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="7f685dd5-8921-4e4a-a4d5-d19a499775f5" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 18 14:16:58 crc kubenswrapper[4817]: I0218 14:16:58.288116 
4817 generic.go:334] "Generic (PLEG): container finished" podID="7f8fdaa1-d441-4b5e-b376-8ab67ce68339" containerID="cf9bda08af9894c9a4a938124159cb1c0d0c15f639c2f973d701ad008225ed32" exitCode=0 Feb 18 14:16:58 crc kubenswrapper[4817]: I0218 14:16:58.288197 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c5btx" event={"ID":"7f8fdaa1-d441-4b5e-b376-8ab67ce68339","Type":"ContainerDied","Data":"cf9bda08af9894c9a4a938124159cb1c0d0c15f639c2f973d701ad008225ed32"} Feb 18 14:16:58 crc kubenswrapper[4817]: I0218 14:16:58.398821 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 18 14:16:58 crc kubenswrapper[4817]: I0218 14:16:58.521494 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 18 14:16:58 crc kubenswrapper[4817]: I0218 14:16:58.869167 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-pbvzk"] Feb 18 14:16:58 crc kubenswrapper[4817]: E0218 14:16:58.870134 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffa1255-1d7c-43a4-a197-931d34164a31" containerName="init" Feb 18 14:16:58 crc kubenswrapper[4817]: I0218 14:16:58.870161 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffa1255-1d7c-43a4-a197-931d34164a31" containerName="init" Feb 18 14:16:58 crc kubenswrapper[4817]: E0218 14:16:58.870173 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffa1255-1d7c-43a4-a197-931d34164a31" containerName="dnsmasq-dns" Feb 18 14:16:58 crc kubenswrapper[4817]: I0218 14:16:58.870183 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffa1255-1d7c-43a4-a197-931d34164a31" containerName="dnsmasq-dns" Feb 18 14:16:58 crc kubenswrapper[4817]: I0218 14:16:58.870408 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffa1255-1d7c-43a4-a197-931d34164a31" containerName="dnsmasq-dns" Feb 18 14:16:58 crc 
kubenswrapper[4817]: I0218 14:16:58.871422 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pbvzk" Feb 18 14:16:58 crc kubenswrapper[4817]: I0218 14:16:58.877041 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 18 14:16:58 crc kubenswrapper[4817]: I0218 14:16:58.886615 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pbvzk"] Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.063270 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zz6r\" (UniqueName: \"kubernetes.io/projected/20a99eb1-1830-4268-a857-107242bbb333-kube-api-access-5zz6r\") pod \"root-account-create-update-pbvzk\" (UID: \"20a99eb1-1830-4268-a857-107242bbb333\") " pod="openstack/root-account-create-update-pbvzk" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.064438 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a99eb1-1830-4268-a857-107242bbb333-operator-scripts\") pod \"root-account-create-update-pbvzk\" (UID: \"20a99eb1-1830-4268-a857-107242bbb333\") " pod="openstack/root-account-create-update-pbvzk" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.166865 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a99eb1-1830-4268-a857-107242bbb333-operator-scripts\") pod \"root-account-create-update-pbvzk\" (UID: \"20a99eb1-1830-4268-a857-107242bbb333\") " pod="openstack/root-account-create-update-pbvzk" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.167033 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zz6r\" (UniqueName: 
\"kubernetes.io/projected/20a99eb1-1830-4268-a857-107242bbb333-kube-api-access-5zz6r\") pod \"root-account-create-update-pbvzk\" (UID: \"20a99eb1-1830-4268-a857-107242bbb333\") " pod="openstack/root-account-create-update-pbvzk" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.167775 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a99eb1-1830-4268-a857-107242bbb333-operator-scripts\") pod \"root-account-create-update-pbvzk\" (UID: \"20a99eb1-1830-4268-a857-107242bbb333\") " pod="openstack/root-account-create-update-pbvzk" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.188250 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zz6r\" (UniqueName: \"kubernetes.io/projected/20a99eb1-1830-4268-a857-107242bbb333-kube-api-access-5zz6r\") pod \"root-account-create-update-pbvzk\" (UID: \"20a99eb1-1830-4268-a857-107242bbb333\") " pod="openstack/root-account-create-update-pbvzk" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.194722 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pbvzk" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.304252 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d5d448f4-839e-4b71-ac6e-0c941ccd5a14","Type":"ContainerStarted","Data":"d497a7cb528a81ed2229b0cdac90677ed3c2d2f20b82859c9abe6a0511ea7ee8"} Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.340945 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=11.260727549 podStartE2EDuration="1m5.340910569s" podCreationTimestamp="2026-02-18 14:15:54 +0000 UTC" firstStartedPulling="2026-02-18 14:16:04.496446774 +0000 UTC m=+1027.071982767" lastFinishedPulling="2026-02-18 14:16:58.576629804 +0000 UTC m=+1081.152165787" observedRunningTime="2026-02-18 14:16:59.333184645 +0000 UTC m=+1081.908720648" watchObservedRunningTime="2026-02-18 14:16:59.340910569 +0000 UTC m=+1081.916446542" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.794711 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.988524 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-ring-data-devices\") pod \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.988601 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-scripts\") pod \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.988632 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-dispersionconf\") pod \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.988674 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-etc-swift\") pod \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.988844 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-combined-ca-bundle\") pod \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.988879 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-swiftconf\") pod \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.988911 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n29bh\" (UniqueName: \"kubernetes.io/projected/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-kube-api-access-n29bh\") pod \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\" (UID: \"7f8fdaa1-d441-4b5e-b376-8ab67ce68339\") " Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.991046 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7f8fdaa1-d441-4b5e-b376-8ab67ce68339" (UID: "7f8fdaa1-d441-4b5e-b376-8ab67ce68339"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.991158 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7f8fdaa1-d441-4b5e-b376-8ab67ce68339" (UID: "7f8fdaa1-d441-4b5e-b376-8ab67ce68339"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:16:59 crc kubenswrapper[4817]: I0218 14:16:59.998337 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-kube-api-access-n29bh" (OuterVolumeSpecName: "kube-api-access-n29bh") pod "7f8fdaa1-d441-4b5e-b376-8ab67ce68339" (UID: "7f8fdaa1-d441-4b5e-b376-8ab67ce68339"). InnerVolumeSpecName "kube-api-access-n29bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.002098 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7f8fdaa1-d441-4b5e-b376-8ab67ce68339" (UID: "7f8fdaa1-d441-4b5e-b376-8ab67ce68339"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.020714 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f8fdaa1-d441-4b5e-b376-8ab67ce68339" (UID: "7f8fdaa1-d441-4b5e-b376-8ab67ce68339"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.027202 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7f8fdaa1-d441-4b5e-b376-8ab67ce68339" (UID: "7f8fdaa1-d441-4b5e-b376-8ab67ce68339"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.028576 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-scripts" (OuterVolumeSpecName: "scripts") pod "7f8fdaa1-d441-4b5e-b376-8ab67ce68339" (UID: "7f8fdaa1-d441-4b5e-b376-8ab67ce68339"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.041049 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pbvzk"] Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.092122 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.092155 4817 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.092167 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n29bh\" (UniqueName: \"kubernetes.io/projected/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-kube-api-access-n29bh\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.092180 4817 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.092190 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.092200 4817 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.092212 4817 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/7f8fdaa1-d441-4b5e-b376-8ab67ce68339-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.314168 4817 generic.go:334] "Generic (PLEG): container finished" podID="14e634c8-da00-43a5-96a8-33e8bf806873" containerID="9a5e9d303e366c3bb999a78527cf3761a74f192fe80921e3e7ff942f31d4204e" exitCode=0 Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.314268 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"14e634c8-da00-43a5-96a8-33e8bf806873","Type":"ContainerDied","Data":"9a5e9d303e366c3bb999a78527cf3761a74f192fe80921e3e7ff942f31d4204e"} Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.320415 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pbvzk" event={"ID":"20a99eb1-1830-4268-a857-107242bbb333","Type":"ContainerStarted","Data":"dbaeaf695a0606c28a668d8f270b73ce7468bb3cb34c594fa65af74723208d2d"} Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.320457 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pbvzk" event={"ID":"20a99eb1-1830-4268-a857-107242bbb333","Type":"ContainerStarted","Data":"aa2322e85fdf7f74d1a8dee92ca3aa7ec8089fa0406249095c297d51d187bda3"} Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.327515 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c5btx" event={"ID":"7f8fdaa1-d441-4b5e-b376-8ab67ce68339","Type":"ContainerDied","Data":"7d1ffdbcea6dfca08e3f8629b7f3ae45ad0d1f26ecb99f86cd3bffc954493a34"} Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.327560 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d1ffdbcea6dfca08e3f8629b7f3ae45ad0d1f26ecb99f86cd3bffc954493a34" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.327630 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-c5btx" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.340847 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b26cbdf-a148-4d07-bffc-afa241bc30e2","Type":"ContainerStarted","Data":"19122d80e56397bbfabab6971d7b4919e006dab72083f85cbd24aaf889302a62"} Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.341188 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b26cbdf-a148-4d07-bffc-afa241bc30e2","Type":"ContainerStarted","Data":"d7f68295eefc288dadad4a2e6c8e76436b83cddc64be28fdf5349e4391434e4f"} Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.341302 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.373935 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-pbvzk" podStartSLOduration=2.373912501 podStartE2EDuration="2.373912501s" podCreationTimestamp="2026-02-18 14:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:00.37265033 +0000 UTC m=+1082.948186313" watchObservedRunningTime="2026-02-18 14:17:00.373912501 +0000 UTC m=+1082.949448484" Feb 18 14:17:00 crc kubenswrapper[4817]: I0218 14:17:00.395626 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.118999061 podStartE2EDuration="7.395605265s" podCreationTimestamp="2026-02-18 14:16:53 +0000 UTC" firstStartedPulling="2026-02-18 14:16:54.512943155 +0000 UTC m=+1077.088479138" lastFinishedPulling="2026-02-18 14:16:59.789549359 +0000 UTC m=+1082.365085342" observedRunningTime="2026-02-18 14:17:00.394401425 +0000 UTC m=+1082.969937408" watchObservedRunningTime="2026-02-18 14:17:00.395605265 +0000 UTC 
m=+1082.971141248" Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.047252 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.347221 4817 generic.go:334] "Generic (PLEG): container finished" podID="20a99eb1-1830-4268-a857-107242bbb333" containerID="dbaeaf695a0606c28a668d8f270b73ce7468bb3cb34c594fa65af74723208d2d" exitCode=0 Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.347297 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pbvzk" event={"ID":"20a99eb1-1830-4268-a857-107242bbb333","Type":"ContainerDied","Data":"dbaeaf695a0606c28a668d8f270b73ce7468bb3cb34c594fa65af74723208d2d"} Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.349330 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"14e634c8-da00-43a5-96a8-33e8bf806873","Type":"ContainerStarted","Data":"2187986625b1e0afd9d63e29f131284ecbd1b9924fcf0bd96558e3a42054490b"} Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.349578 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.398487 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=54.606031591 podStartE2EDuration="1m14.398468303s" podCreationTimestamp="2026-02-18 14:15:47 +0000 UTC" firstStartedPulling="2026-02-18 14:16:03.743449228 +0000 UTC m=+1026.318985221" lastFinishedPulling="2026-02-18 14:16:23.53588595 +0000 UTC m=+1046.111421933" observedRunningTime="2026-02-18 14:17:01.393513939 +0000 UTC m=+1083.969049932" watchObservedRunningTime="2026-02-18 14:17:01.398468303 +0000 UTC m=+1083.974004286" Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.621650 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/openstack-cell1-galera-0" Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.727188 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.996039 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-6bvsw"] Feb 18 14:17:01 crc kubenswrapper[4817]: E0218 14:17:01.996739 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8fdaa1-d441-4b5e-b376-8ab67ce68339" containerName="swift-ring-rebalance" Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.996766 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8fdaa1-d441-4b5e-b376-8ab67ce68339" containerName="swift-ring-rebalance" Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.997058 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8fdaa1-d441-4b5e-b376-8ab67ce68339" containerName="swift-ring-rebalance" Feb 18 14:17:01 crc kubenswrapper[4817]: I0218 14:17:01.998758 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6bvsw" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.004826 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6bvsw"] Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.107051 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e38b-account-create-update-rlln2"] Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.108617 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e38b-account-create-update-rlln2" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.111348 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.123439 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e38b-account-create-update-rlln2"] Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.146158 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqdv\" (UniqueName: \"kubernetes.io/projected/de3b6456-c155-43c3-9b35-5832009c7054-kube-api-access-9pqdv\") pod \"glance-db-create-6bvsw\" (UID: \"de3b6456-c155-43c3-9b35-5832009c7054\") " pod="openstack/glance-db-create-6bvsw" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.146358 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de3b6456-c155-43c3-9b35-5832009c7054-operator-scripts\") pod \"glance-db-create-6bvsw\" (UID: \"de3b6456-c155-43c3-9b35-5832009c7054\") " pod="openstack/glance-db-create-6bvsw" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.248016 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de3b6456-c155-43c3-9b35-5832009c7054-operator-scripts\") pod \"glance-db-create-6bvsw\" (UID: \"de3b6456-c155-43c3-9b35-5832009c7054\") " pod="openstack/glance-db-create-6bvsw" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.248107 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqdv\" (UniqueName: \"kubernetes.io/projected/de3b6456-c155-43c3-9b35-5832009c7054-kube-api-access-9pqdv\") pod \"glance-db-create-6bvsw\" (UID: \"de3b6456-c155-43c3-9b35-5832009c7054\") " pod="openstack/glance-db-create-6bvsw" 
Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.248135 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45992562-7a07-46a2-93fb-3c5fc00b367c-operator-scripts\") pod \"glance-e38b-account-create-update-rlln2\" (UID: \"45992562-7a07-46a2-93fb-3c5fc00b367c\") " pod="openstack/glance-e38b-account-create-update-rlln2" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.248184 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlblh\" (UniqueName: \"kubernetes.io/projected/45992562-7a07-46a2-93fb-3c5fc00b367c-kube-api-access-jlblh\") pod \"glance-e38b-account-create-update-rlln2\" (UID: \"45992562-7a07-46a2-93fb-3c5fc00b367c\") " pod="openstack/glance-e38b-account-create-update-rlln2" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.248883 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de3b6456-c155-43c3-9b35-5832009c7054-operator-scripts\") pod \"glance-db-create-6bvsw\" (UID: \"de3b6456-c155-43c3-9b35-5832009c7054\") " pod="openstack/glance-db-create-6bvsw" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.280792 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqdv\" (UniqueName: \"kubernetes.io/projected/de3b6456-c155-43c3-9b35-5832009c7054-kube-api-access-9pqdv\") pod \"glance-db-create-6bvsw\" (UID: \"de3b6456-c155-43c3-9b35-5832009c7054\") " pod="openstack/glance-db-create-6bvsw" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.317347 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6bvsw" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.350534 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlblh\" (UniqueName: \"kubernetes.io/projected/45992562-7a07-46a2-93fb-3c5fc00b367c-kube-api-access-jlblh\") pod \"glance-e38b-account-create-update-rlln2\" (UID: \"45992562-7a07-46a2-93fb-3c5fc00b367c\") " pod="openstack/glance-e38b-account-create-update-rlln2" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.350866 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45992562-7a07-46a2-93fb-3c5fc00b367c-operator-scripts\") pod \"glance-e38b-account-create-update-rlln2\" (UID: \"45992562-7a07-46a2-93fb-3c5fc00b367c\") " pod="openstack/glance-e38b-account-create-update-rlln2" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.351907 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45992562-7a07-46a2-93fb-3c5fc00b367c-operator-scripts\") pod \"glance-e38b-account-create-update-rlln2\" (UID: \"45992562-7a07-46a2-93fb-3c5fc00b367c\") " pod="openstack/glance-e38b-account-create-update-rlln2" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.373673 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlblh\" (UniqueName: \"kubernetes.io/projected/45992562-7a07-46a2-93fb-3c5fc00b367c-kube-api-access-jlblh\") pod \"glance-e38b-account-create-update-rlln2\" (UID: \"45992562-7a07-46a2-93fb-3c5fc00b367c\") " pod="openstack/glance-e38b-account-create-update-rlln2" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.378806 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-46wx9" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.428408 4817 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-e38b-account-create-update-rlln2" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.429680 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-46wx9" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.684082 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9bcxg-config-ktxwt"] Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.685763 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.689166 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.698262 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9bcxg-config-ktxwt"] Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.761356 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.761463 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-additional-scripts\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.761517 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-scripts\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.761570 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgwlw\" (UniqueName: \"kubernetes.io/projected/7ba2896e-c9b2-4110-a66a-bf27049b80f0-kube-api-access-bgwlw\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.761638 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run-ovn\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.761665 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-log-ovn\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.785080 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pbvzk" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.858553 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-268hx"] Feb 18 14:17:02 crc kubenswrapper[4817]: E0218 14:17:02.859103 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a99eb1-1830-4268-a857-107242bbb333" containerName="mariadb-account-create-update" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.859121 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a99eb1-1830-4268-a857-107242bbb333" containerName="mariadb-account-create-update" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.859344 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a99eb1-1830-4268-a857-107242bbb333" containerName="mariadb-account-create-update" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.860280 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-268hx" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.862466 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a99eb1-1830-4268-a857-107242bbb333-operator-scripts\") pod \"20a99eb1-1830-4268-a857-107242bbb333\" (UID: \"20a99eb1-1830-4268-a857-107242bbb333\") " Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.862592 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zz6r\" (UniqueName: \"kubernetes.io/projected/20a99eb1-1830-4268-a857-107242bbb333-kube-api-access-5zz6r\") pod \"20a99eb1-1830-4268-a857-107242bbb333\" (UID: \"20a99eb1-1830-4268-a857-107242bbb333\") " Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.862961 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run-ovn\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.863059 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-log-ovn\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.863191 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.863264 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-additional-scripts\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.863311 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-scripts\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.863351 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgwlw\" (UniqueName: 
\"kubernetes.io/projected/7ba2896e-c9b2-4110-a66a-bf27049b80f0-kube-api-access-bgwlw\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.864415 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a99eb1-1830-4268-a857-107242bbb333-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "20a99eb1-1830-4268-a857-107242bbb333" (UID: "20a99eb1-1830-4268-a857-107242bbb333"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.864921 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.865030 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run-ovn\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.865087 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-log-ovn\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.865861 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-additional-scripts\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.867777 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-scripts\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.869813 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a99eb1-1830-4268-a857-107242bbb333-kube-api-access-5zz6r" (OuterVolumeSpecName: "kube-api-access-5zz6r") pod "20a99eb1-1830-4268-a857-107242bbb333" (UID: "20a99eb1-1830-4268-a857-107242bbb333"). InnerVolumeSpecName "kube-api-access-5zz6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.871050 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-268hx"] Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.888615 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgwlw\" (UniqueName: \"kubernetes.io/projected/7ba2896e-c9b2-4110-a66a-bf27049b80f0-kube-api-access-bgwlw\") pod \"ovn-controller-9bcxg-config-ktxwt\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") " pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.945613 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bd17-account-create-update-v4mb6"] Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.962345 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bd17-account-create-update-v4mb6" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.967056 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.973962 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eabe27ea-3386-4a4d-bba3-1786b2041d2a-operator-scripts\") pod \"keystone-db-create-268hx\" (UID: \"eabe27ea-3386-4a4d-bba3-1786b2041d2a\") " pod="openstack/keystone-db-create-268hx" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.974106 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vhgp\" (UniqueName: \"kubernetes.io/projected/eabe27ea-3386-4a4d-bba3-1786b2041d2a-kube-api-access-5vhgp\") pod \"keystone-db-create-268hx\" (UID: \"eabe27ea-3386-4a4d-bba3-1786b2041d2a\") " pod="openstack/keystone-db-create-268hx" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.974475 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a99eb1-1830-4268-a857-107242bbb333-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.974505 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zz6r\" (UniqueName: \"kubernetes.io/projected/20a99eb1-1830-4268-a857-107242bbb333-kube-api-access-5zz6r\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:02 crc kubenswrapper[4817]: I0218 14:17:02.978126 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bd17-account-create-update-v4mb6"] Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.014769 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6bvsw"] Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 
14:17:03.025279 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bcxg-config-ktxwt" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.076043 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x86dw\" (UniqueName: \"kubernetes.io/projected/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-kube-api-access-x86dw\") pod \"keystone-bd17-account-create-update-v4mb6\" (UID: \"68ca39a4-c529-4e29-b4f0-bea5dde2dab9\") " pod="openstack/keystone-bd17-account-create-update-v4mb6" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.076133 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-operator-scripts\") pod \"keystone-bd17-account-create-update-v4mb6\" (UID: \"68ca39a4-c529-4e29-b4f0-bea5dde2dab9\") " pod="openstack/keystone-bd17-account-create-update-v4mb6" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.076337 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eabe27ea-3386-4a4d-bba3-1786b2041d2a-operator-scripts\") pod \"keystone-db-create-268hx\" (UID: \"eabe27ea-3386-4a4d-bba3-1786b2041d2a\") " pod="openstack/keystone-db-create-268hx" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.076425 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhgp\" (UniqueName: \"kubernetes.io/projected/eabe27ea-3386-4a4d-bba3-1786b2041d2a-kube-api-access-5vhgp\") pod \"keystone-db-create-268hx\" (UID: \"eabe27ea-3386-4a4d-bba3-1786b2041d2a\") " pod="openstack/keystone-db-create-268hx" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.077250 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/eabe27ea-3386-4a4d-bba3-1786b2041d2a-operator-scripts\") pod \"keystone-db-create-268hx\" (UID: \"eabe27ea-3386-4a4d-bba3-1786b2041d2a\") " pod="openstack/keystone-db-create-268hx" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.097518 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vhgp\" (UniqueName: \"kubernetes.io/projected/eabe27ea-3386-4a4d-bba3-1786b2041d2a-kube-api-access-5vhgp\") pod \"keystone-db-create-268hx\" (UID: \"eabe27ea-3386-4a4d-bba3-1786b2041d2a\") " pod="openstack/keystone-db-create-268hx" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.127055 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e38b-account-create-update-rlln2"] Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.149028 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6dg5k"] Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.150363 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6dg5k" Feb 18 14:17:03 crc kubenswrapper[4817]: W0218 14:17:03.151789 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45992562_7a07_46a2_93fb_3c5fc00b367c.slice/crio-1f3bc804fc98366697e828c1bb213498615d025ee7d243e6a9672172ce4e5860 WatchSource:0}: Error finding container 1f3bc804fc98366697e828c1bb213498615d025ee7d243e6a9672172ce4e5860: Status 404 returned error can't find the container with id 1f3bc804fc98366697e828c1bb213498615d025ee7d243e6a9672172ce4e5860 Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.161166 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3519-account-create-update-4tjwq"] Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.165695 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3519-account-create-update-4tjwq" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.170358 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.179798 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-268hx" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.181540 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x86dw\" (UniqueName: \"kubernetes.io/projected/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-kube-api-access-x86dw\") pod \"keystone-bd17-account-create-update-v4mb6\" (UID: \"68ca39a4-c529-4e29-b4f0-bea5dde2dab9\") " pod="openstack/keystone-bd17-account-create-update-v4mb6" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.181634 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-operator-scripts\") pod \"keystone-bd17-account-create-update-v4mb6\" (UID: \"68ca39a4-c529-4e29-b4f0-bea5dde2dab9\") " pod="openstack/keystone-bd17-account-create-update-v4mb6" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.182586 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-operator-scripts\") pod \"keystone-bd17-account-create-update-v4mb6\" (UID: \"68ca39a4-c529-4e29-b4f0-bea5dde2dab9\") " pod="openstack/keystone-bd17-account-create-update-v4mb6" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.183744 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6dg5k"] Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.206086 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-3519-account-create-update-4tjwq"] Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.227178 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x86dw\" (UniqueName: \"kubernetes.io/projected/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-kube-api-access-x86dw\") pod \"keystone-bd17-account-create-update-v4mb6\" (UID: \"68ca39a4-c529-4e29-b4f0-bea5dde2dab9\") " pod="openstack/keystone-bd17-account-create-update-v4mb6" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.285519 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ca54d4-d910-46fb-9966-792b61e4969b-operator-scripts\") pod \"placement-3519-account-create-update-4tjwq\" (UID: \"22ca54d4-d910-46fb-9966-792b61e4969b\") " pod="openstack/placement-3519-account-create-update-4tjwq" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.285685 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb9cb\" (UniqueName: \"kubernetes.io/projected/d5efb39b-4f4c-4cae-a4f6-529877efbafb-kube-api-access-bb9cb\") pod \"placement-db-create-6dg5k\" (UID: \"d5efb39b-4f4c-4cae-a4f6-529877efbafb\") " pod="openstack/placement-db-create-6dg5k" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.285833 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5efb39b-4f4c-4cae-a4f6-529877efbafb-operator-scripts\") pod \"placement-db-create-6dg5k\" (UID: \"d5efb39b-4f4c-4cae-a4f6-529877efbafb\") " pod="openstack/placement-db-create-6dg5k" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.286183 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcbpj\" (UniqueName: 
\"kubernetes.io/projected/22ca54d4-d910-46fb-9966-792b61e4969b-kube-api-access-rcbpj\") pod \"placement-3519-account-create-update-4tjwq\" (UID: \"22ca54d4-d910-46fb-9966-792b61e4969b\") " pod="openstack/placement-3519-account-create-update-4tjwq" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.303016 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bd17-account-create-update-v4mb6" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.393115 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcbpj\" (UniqueName: \"kubernetes.io/projected/22ca54d4-d910-46fb-9966-792b61e4969b-kube-api-access-rcbpj\") pod \"placement-3519-account-create-update-4tjwq\" (UID: \"22ca54d4-d910-46fb-9966-792b61e4969b\") " pod="openstack/placement-3519-account-create-update-4tjwq" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.393230 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ca54d4-d910-46fb-9966-792b61e4969b-operator-scripts\") pod \"placement-3519-account-create-update-4tjwq\" (UID: \"22ca54d4-d910-46fb-9966-792b61e4969b\") " pod="openstack/placement-3519-account-create-update-4tjwq" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.393262 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb9cb\" (UniqueName: \"kubernetes.io/projected/d5efb39b-4f4c-4cae-a4f6-529877efbafb-kube-api-access-bb9cb\") pod \"placement-db-create-6dg5k\" (UID: \"d5efb39b-4f4c-4cae-a4f6-529877efbafb\") " pod="openstack/placement-db-create-6dg5k" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.393316 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5efb39b-4f4c-4cae-a4f6-529877efbafb-operator-scripts\") pod \"placement-db-create-6dg5k\" (UID: 
\"d5efb39b-4f4c-4cae-a4f6-529877efbafb\") " pod="openstack/placement-db-create-6dg5k" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.394252 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5efb39b-4f4c-4cae-a4f6-529877efbafb-operator-scripts\") pod \"placement-db-create-6dg5k\" (UID: \"d5efb39b-4f4c-4cae-a4f6-529877efbafb\") " pod="openstack/placement-db-create-6dg5k" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.395567 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ca54d4-d910-46fb-9966-792b61e4969b-operator-scripts\") pod \"placement-3519-account-create-update-4tjwq\" (UID: \"22ca54d4-d910-46fb-9966-792b61e4969b\") " pod="openstack/placement-3519-account-create-update-4tjwq" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.419997 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcbpj\" (UniqueName: \"kubernetes.io/projected/22ca54d4-d910-46fb-9966-792b61e4969b-kube-api-access-rcbpj\") pod \"placement-3519-account-create-update-4tjwq\" (UID: \"22ca54d4-d910-46fb-9966-792b61e4969b\") " pod="openstack/placement-3519-account-create-update-4tjwq" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.429733 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6bvsw" event={"ID":"de3b6456-c155-43c3-9b35-5832009c7054","Type":"ContainerStarted","Data":"3e71791cd6590943c0c6e0cf316c02d74c31a8d7ce97b9649c5b05b4ca2c8622"} Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.429795 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6bvsw" event={"ID":"de3b6456-c155-43c3-9b35-5832009c7054","Type":"ContainerStarted","Data":"2d03d1e139750a8c9e3d8c20b63d06bcafaf3d5fa047fba6053f1214e38347ac"} Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.439960 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb9cb\" (UniqueName: \"kubernetes.io/projected/d5efb39b-4f4c-4cae-a4f6-529877efbafb-kube-api-access-bb9cb\") pod \"placement-db-create-6dg5k\" (UID: \"d5efb39b-4f4c-4cae-a4f6-529877efbafb\") " pod="openstack/placement-db-create-6dg5k" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.441498 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e38b-account-create-update-rlln2" event={"ID":"45992562-7a07-46a2-93fb-3c5fc00b367c","Type":"ContainerStarted","Data":"1f3bc804fc98366697e828c1bb213498615d025ee7d243e6a9672172ce4e5860"} Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.446956 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pbvzk" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.447635 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pbvzk" event={"ID":"20a99eb1-1830-4268-a857-107242bbb333","Type":"ContainerDied","Data":"aa2322e85fdf7f74d1a8dee92ca3aa7ec8089fa0406249095c297d51d187bda3"} Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.447661 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa2322e85fdf7f74d1a8dee92ca3aa7ec8089fa0406249095c297d51d187bda3" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.456059 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-6bvsw" podStartSLOduration=2.456037839 podStartE2EDuration="2.456037839s" podCreationTimestamp="2026-02-18 14:17:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:03.453773262 +0000 UTC m=+1086.029309265" watchObservedRunningTime="2026-02-18 14:17:03.456037839 +0000 UTC m=+1086.031573822" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 
14:17:03.529666 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6dg5k" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.539577 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3519-account-create-update-4tjwq" Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.569898 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9bcxg-config-ktxwt"] Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.789558 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-268hx"] Feb 18 14:17:03 crc kubenswrapper[4817]: I0218 14:17:03.895212 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bd17-account-create-update-v4mb6"] Feb 18 14:17:03 crc kubenswrapper[4817]: W0218 14:17:03.905516 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68ca39a4_c529_4e29_b4f0_bea5dde2dab9.slice/crio-1076a02e7b0c749535a4b2951f2d9ad084d4d50574f3948e336738b527c063df WatchSource:0}: Error finding container 1076a02e7b0c749535a4b2951f2d9ad084d4d50574f3948e336738b527c063df: Status 404 returned error can't find the container with id 1076a02e7b0c749535a4b2951f2d9ad084d4d50574f3948e336738b527c063df Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.072875 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3519-account-create-update-4tjwq"] Feb 18 14:17:04 crc kubenswrapper[4817]: W0218 14:17:04.082021 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22ca54d4_d910_46fb_9966_792b61e4969b.slice/crio-2145c27d19d184e1791f7c9a1757f47d3a41260348246decc8d7f77538099e7f WatchSource:0}: Error finding container 2145c27d19d184e1791f7c9a1757f47d3a41260348246decc8d7f77538099e7f: Status 404 returned error 
can't find the container with id 2145c27d19d184e1791f7c9a1757f47d3a41260348246decc8d7f77538099e7f Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.220712 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6dg5k"] Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.457618 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bd17-account-create-update-v4mb6" event={"ID":"68ca39a4-c529-4e29-b4f0-bea5dde2dab9","Type":"ContainerStarted","Data":"0cda4539915d564c1f1a0a904f5f9eee6275da5e919fceb4db7a2a5be677f158"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.457970 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bd17-account-create-update-v4mb6" event={"ID":"68ca39a4-c529-4e29-b4f0-bea5dde2dab9","Type":"ContainerStarted","Data":"1076a02e7b0c749535a4b2951f2d9ad084d4d50574f3948e336738b527c063df"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.460139 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6dg5k" event={"ID":"d5efb39b-4f4c-4cae-a4f6-529877efbafb","Type":"ContainerStarted","Data":"242fb7d69607fff9a64e955e63a2147c95f81370401563b0d8d97bc28c38f616"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.467142 4817 generic.go:334] "Generic (PLEG): container finished" podID="45992562-7a07-46a2-93fb-3c5fc00b367c" containerID="4344cd761fe1014382b5bf26b108be7d341da3d9a3a09754092aa47d2a7fa98f" exitCode=0 Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.467176 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e38b-account-create-update-rlln2" event={"ID":"45992562-7a07-46a2-93fb-3c5fc00b367c","Type":"ContainerDied","Data":"4344cd761fe1014382b5bf26b108be7d341da3d9a3a09754092aa47d2a7fa98f"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.471265 4817 generic.go:334] "Generic (PLEG): container finished" podID="eabe27ea-3386-4a4d-bba3-1786b2041d2a" 
containerID="7ea18320b93e4d6a63ee602f94a46f9b3d8f6cd87e1ea610e7d55edd2aef27b6" exitCode=0 Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.471379 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-268hx" event={"ID":"eabe27ea-3386-4a4d-bba3-1786b2041d2a","Type":"ContainerDied","Data":"7ea18320b93e4d6a63ee602f94a46f9b3d8f6cd87e1ea610e7d55edd2aef27b6"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.471406 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-268hx" event={"ID":"eabe27ea-3386-4a4d-bba3-1786b2041d2a","Type":"ContainerStarted","Data":"03bb687332b6e0a9e0a12b17fc25585caebc8b340af1455bb1612fd2df196147"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.474934 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bd17-account-create-update-v4mb6" podStartSLOduration=2.474911417 podStartE2EDuration="2.474911417s" podCreationTimestamp="2026-02-18 14:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:04.469508212 +0000 UTC m=+1087.045044205" watchObservedRunningTime="2026-02-18 14:17:04.474911417 +0000 UTC m=+1087.050447400" Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.494741 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3519-account-create-update-4tjwq" event={"ID":"22ca54d4-d910-46fb-9966-792b61e4969b","Type":"ContainerStarted","Data":"acef9937a409cfd792c07db69d073715e581438c9fab8c3beaa608147c79d31c"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.494829 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3519-account-create-update-4tjwq" event={"ID":"22ca54d4-d910-46fb-9966-792b61e4969b","Type":"ContainerStarted","Data":"2145c27d19d184e1791f7c9a1757f47d3a41260348246decc8d7f77538099e7f"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 
14:17:04.502154 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bcxg-config-ktxwt" event={"ID":"7ba2896e-c9b2-4110-a66a-bf27049b80f0","Type":"ContainerStarted","Data":"4b71496c76fcfa64e3b94ed413e473b6c3559b2cd28dd667d386cc78bbcb413e"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.502400 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bcxg-config-ktxwt" event={"ID":"7ba2896e-c9b2-4110-a66a-bf27049b80f0","Type":"ContainerStarted","Data":"5eedceb1ccf0df104ddd55c0579cb62547fe8cef884cc412273ae938422a75f9"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.525993 4817 generic.go:334] "Generic (PLEG): container finished" podID="de3b6456-c155-43c3-9b35-5832009c7054" containerID="3e71791cd6590943c0c6e0cf316c02d74c31a8d7ce97b9649c5b05b4ca2c8622" exitCode=0 Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.527095 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6bvsw" event={"ID":"de3b6456-c155-43c3-9b35-5832009c7054","Type":"ContainerDied","Data":"3e71791cd6590943c0c6e0cf316c02d74c31a8d7ce97b9649c5b05b4ca2c8622"} Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.543255 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-3519-account-create-update-4tjwq" podStartSLOduration=1.54322922 podStartE2EDuration="1.54322922s" podCreationTimestamp="2026-02-18 14:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:04.54242951 +0000 UTC m=+1087.117965503" watchObservedRunningTime="2026-02-18 14:17:04.54322922 +0000 UTC m=+1087.118765203" Feb 18 14:17:04 crc kubenswrapper[4817]: I0218 14:17:04.598825 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9bcxg-config-ktxwt" podStartSLOduration=2.598802104 podStartE2EDuration="2.598802104s" 
podCreationTimestamp="2026-02-18 14:17:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:04.588198838 +0000 UTC m=+1087.163734851" watchObservedRunningTime="2026-02-18 14:17:04.598802104 +0000 UTC m=+1087.174338097"
Feb 18 14:17:04 crc kubenswrapper[4817]: E0218 14:17:04.639196 4817 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeabe27ea_3386_4a4d_bba3_1786b2041d2a.slice/crio-conmon-7ea18320b93e4d6a63ee602f94a46f9b3d8f6cd87e1ea610e7d55edd2aef27b6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeabe27ea_3386_4a4d_bba3_1786b2041d2a.slice/crio-7ea18320b93e4d6a63ee602f94a46f9b3d8f6cd87e1ea610e7d55edd2aef27b6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68ca39a4_c529_4e29_b4f0_bea5dde2dab9.slice/crio-0cda4539915d564c1f1a0a904f5f9eee6275da5e919fceb4db7a2a5be677f158.scope\": RecentStats: unable to find data in memory cache]"
Feb 18 14:17:05 crc kubenswrapper[4817]: I0218 14:17:05.539806 4817 generic.go:334] "Generic (PLEG): container finished" podID="68ca39a4-c529-4e29-b4f0-bea5dde2dab9" containerID="0cda4539915d564c1f1a0a904f5f9eee6275da5e919fceb4db7a2a5be677f158" exitCode=0
Feb 18 14:17:05 crc kubenswrapper[4817]: I0218 14:17:05.539908 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bd17-account-create-update-v4mb6" event={"ID":"68ca39a4-c529-4e29-b4f0-bea5dde2dab9","Type":"ContainerDied","Data":"0cda4539915d564c1f1a0a904f5f9eee6275da5e919fceb4db7a2a5be677f158"}
Feb 18 14:17:05 crc kubenswrapper[4817]: I0218 14:17:05.542446 4817 generic.go:334] "Generic (PLEG): container finished" podID="d5efb39b-4f4c-4cae-a4f6-529877efbafb"
containerID="5811f76d8389c264d315be4cb88f7078ce8c6626bf4cdee993d6f8845c7f2119" exitCode=0
Feb 18 14:17:05 crc kubenswrapper[4817]: I0218 14:17:05.542535 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6dg5k" event={"ID":"d5efb39b-4f4c-4cae-a4f6-529877efbafb","Type":"ContainerDied","Data":"5811f76d8389c264d315be4cb88f7078ce8c6626bf4cdee993d6f8845c7f2119"}
Feb 18 14:17:05 crc kubenswrapper[4817]: I0218 14:17:05.544594 4817 generic.go:334] "Generic (PLEG): container finished" podID="22ca54d4-d910-46fb-9966-792b61e4969b" containerID="acef9937a409cfd792c07db69d073715e581438c9fab8c3beaa608147c79d31c" exitCode=0
Feb 18 14:17:05 crc kubenswrapper[4817]: I0218 14:17:05.544924 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3519-account-create-update-4tjwq" event={"ID":"22ca54d4-d910-46fb-9966-792b61e4969b","Type":"ContainerDied","Data":"acef9937a409cfd792c07db69d073715e581438c9fab8c3beaa608147c79d31c"}
Feb 18 14:17:05 crc kubenswrapper[4817]: I0218 14:17:05.546042 4817 generic.go:334] "Generic (PLEG): container finished" podID="7ba2896e-c9b2-4110-a66a-bf27049b80f0" containerID="4b71496c76fcfa64e3b94ed413e473b6c3559b2cd28dd667d386cc78bbcb413e" exitCode=0
Feb 18 14:17:05 crc kubenswrapper[4817]: I0218 14:17:05.546263 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bcxg-config-ktxwt" event={"ID":"7ba2896e-c9b2-4110-a66a-bf27049b80f0","Type":"ContainerDied","Data":"4b71496c76fcfa64e3b94ed413e473b6c3559b2cd28dd667d386cc78bbcb413e"}
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.073040 4817 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-db-create-6bvsw"
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.159300 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de3b6456-c155-43c3-9b35-5832009c7054-operator-scripts\") pod \"de3b6456-c155-43c3-9b35-5832009c7054\" (UID: \"de3b6456-c155-43c3-9b35-5832009c7054\") "
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.159475 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pqdv\" (UniqueName: \"kubernetes.io/projected/de3b6456-c155-43c3-9b35-5832009c7054-kube-api-access-9pqdv\") pod \"de3b6456-c155-43c3-9b35-5832009c7054\" (UID: \"de3b6456-c155-43c3-9b35-5832009c7054\") "
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.160425 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3b6456-c155-43c3-9b35-5832009c7054-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de3b6456-c155-43c3-9b35-5832009c7054" (UID: "de3b6456-c155-43c3-9b35-5832009c7054"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.165444 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3b6456-c155-43c3-9b35-5832009c7054-kube-api-access-9pqdv" (OuterVolumeSpecName: "kube-api-access-9pqdv") pod "de3b6456-c155-43c3-9b35-5832009c7054" (UID: "de3b6456-c155-43c3-9b35-5832009c7054"). InnerVolumeSpecName "kube-api-access-9pqdv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.228498 4817 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-e38b-account-create-update-rlln2"
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.239721 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-268hx"
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.261607 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de3b6456-c155-43c3-9b35-5832009c7054-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.261646 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pqdv\" (UniqueName: \"kubernetes.io/projected/de3b6456-c155-43c3-9b35-5832009c7054-kube-api-access-9pqdv\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.363031 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlblh\" (UniqueName: \"kubernetes.io/projected/45992562-7a07-46a2-93fb-3c5fc00b367c-kube-api-access-jlblh\") pod \"45992562-7a07-46a2-93fb-3c5fc00b367c\" (UID: \"45992562-7a07-46a2-93fb-3c5fc00b367c\") "
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.363114 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vhgp\" (UniqueName: \"kubernetes.io/projected/eabe27ea-3386-4a4d-bba3-1786b2041d2a-kube-api-access-5vhgp\") pod \"eabe27ea-3386-4a4d-bba3-1786b2041d2a\" (UID: \"eabe27ea-3386-4a4d-bba3-1786b2041d2a\") "
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.363218 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eabe27ea-3386-4a4d-bba3-1786b2041d2a-operator-scripts\") pod \"eabe27ea-3386-4a4d-bba3-1786b2041d2a\" (UID: \"eabe27ea-3386-4a4d-bba3-1786b2041d2a\") "
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.363287 4817
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45992562-7a07-46a2-93fb-3c5fc00b367c-operator-scripts\") pod \"45992562-7a07-46a2-93fb-3c5fc00b367c\" (UID: \"45992562-7a07-46a2-93fb-3c5fc00b367c\") "
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.363727 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eabe27ea-3386-4a4d-bba3-1786b2041d2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eabe27ea-3386-4a4d-bba3-1786b2041d2a" (UID: "eabe27ea-3386-4a4d-bba3-1786b2041d2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.363868 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45992562-7a07-46a2-93fb-3c5fc00b367c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45992562-7a07-46a2-93fb-3c5fc00b367c" (UID: "45992562-7a07-46a2-93fb-3c5fc00b367c"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.364217 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eabe27ea-3386-4a4d-bba3-1786b2041d2a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.364238 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45992562-7a07-46a2-93fb-3c5fc00b367c-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.366494 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45992562-7a07-46a2-93fb-3c5fc00b367c-kube-api-access-jlblh" (OuterVolumeSpecName: "kube-api-access-jlblh") pod "45992562-7a07-46a2-93fb-3c5fc00b367c" (UID: "45992562-7a07-46a2-93fb-3c5fc00b367c"). InnerVolumeSpecName "kube-api-access-jlblh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.367456 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eabe27ea-3386-4a4d-bba3-1786b2041d2a-kube-api-access-5vhgp" (OuterVolumeSpecName: "kube-api-access-5vhgp") pod "eabe27ea-3386-4a4d-bba3-1786b2041d2a" (UID: "eabe27ea-3386-4a4d-bba3-1786b2041d2a"). InnerVolumeSpecName "kube-api-access-5vhgp".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.389299 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="7f685dd5-8921-4e4a-a4d5-d19a499775f5" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.466032 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlblh\" (UniqueName: \"kubernetes.io/projected/45992562-7a07-46a2-93fb-3c5fc00b367c-kube-api-access-jlblh\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.466085 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vhgp\" (UniqueName: \"kubernetes.io/projected/eabe27ea-3386-4a4d-bba3-1786b2041d2a-kube-api-access-5vhgp\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.557214 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6bvsw" event={"ID":"de3b6456-c155-43c3-9b35-5832009c7054","Type":"ContainerDied","Data":"2d03d1e139750a8c9e3d8c20b63d06bcafaf3d5fa047fba6053f1214e38347ac"}
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.557260 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d03d1e139750a8c9e3d8c20b63d06bcafaf3d5fa047fba6053f1214e38347ac"
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.557258 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6bvsw"
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.559381 4817 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-e38b-account-create-update-rlln2"
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.559397 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e38b-account-create-update-rlln2" event={"ID":"45992562-7a07-46a2-93fb-3c5fc00b367c","Type":"ContainerDied","Data":"1f3bc804fc98366697e828c1bb213498615d025ee7d243e6a9672172ce4e5860"}
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.559441 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f3bc804fc98366697e828c1bb213498615d025ee7d243e6a9672172ce4e5860"
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.561560 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-268hx"
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.564207 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-268hx" event={"ID":"eabe27ea-3386-4a4d-bba3-1786b2041d2a","Type":"ContainerDied","Data":"03bb687332b6e0a9e0a12b17fc25585caebc8b340af1455bb1612fd2df196147"}
Feb 18 14:17:06 crc kubenswrapper[4817]: I0218 14:17:06.564247 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03bb687332b6e0a9e0a12b17fc25585caebc8b340af1455bb1612fd2df196147"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.040916 4817 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-3519-account-create-update-4tjwq"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.184143 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcbpj\" (UniqueName: \"kubernetes.io/projected/22ca54d4-d910-46fb-9966-792b61e4969b-kube-api-access-rcbpj\") pod \"22ca54d4-d910-46fb-9966-792b61e4969b\" (UID: \"22ca54d4-d910-46fb-9966-792b61e4969b\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.184485 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ca54d4-d910-46fb-9966-792b61e4969b-operator-scripts\") pod \"22ca54d4-d910-46fb-9966-792b61e4969b\" (UID: \"22ca54d4-d910-46fb-9966-792b61e4969b\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.185808 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22ca54d4-d910-46fb-9966-792b61e4969b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22ca54d4-d910-46fb-9966-792b61e4969b" (UID: "22ca54d4-d910-46fb-9966-792b61e4969b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.186454 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22ca54d4-d910-46fb-9966-792b61e4969b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.189534 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22ca54d4-d910-46fb-9966-792b61e4969b-kube-api-access-rcbpj" (OuterVolumeSpecName: "kube-api-access-rcbpj") pod "22ca54d4-d910-46fb-9966-792b61e4969b" (UID: "22ca54d4-d910-46fb-9966-792b61e4969b"). InnerVolumeSpecName "kube-api-access-rcbpj".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.270616 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bcxg-config-ktxwt"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.290199 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcbpj\" (UniqueName: \"kubernetes.io/projected/22ca54d4-d910-46fb-9966-792b61e4969b-kube-api-access-rcbpj\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.295316 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6dg5k"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.295490 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bd17-account-create-update-v4mb6"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.376050 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9bcxg"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.391223 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-scripts\") pod \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.391267 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5efb39b-4f4c-4cae-a4f6-529877efbafb-operator-scripts\") pod \"d5efb39b-4f4c-4cae-a4f6-529877efbafb\" (UID: \"d5efb39b-4f4c-4cae-a4f6-529877efbafb\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.391358 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName:
\"kubernetes.io/configmap/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-operator-scripts\") pod \"68ca39a4-c529-4e29-b4f0-bea5dde2dab9\" (UID: \"68ca39a4-c529-4e29-b4f0-bea5dde2dab9\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.391406 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-additional-scripts\") pod \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.391433 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb9cb\" (UniqueName: \"kubernetes.io/projected/d5efb39b-4f4c-4cae-a4f6-529877efbafb-kube-api-access-bb9cb\") pod \"d5efb39b-4f4c-4cae-a4f6-529877efbafb\" (UID: \"d5efb39b-4f4c-4cae-a4f6-529877efbafb\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.391473 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run\") pod \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.391523 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x86dw\" (UniqueName: \"kubernetes.io/projected/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-kube-api-access-x86dw\") pod \"68ca39a4-c529-4e29-b4f0-bea5dde2dab9\" (UID: \"68ca39a4-c529-4e29-b4f0-bea5dde2dab9\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.391570 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-log-ovn\") pod \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") "
Feb 18 14:17:07 crc
kubenswrapper[4817]: I0218 14:17:07.391649 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run-ovn\") pod \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.391687 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgwlw\" (UniqueName: \"kubernetes.io/projected/7ba2896e-c9b2-4110-a66a-bf27049b80f0-kube-api-access-bgwlw\") pod \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\" (UID: \"7ba2896e-c9b2-4110-a66a-bf27049b80f0\") "
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.391802 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run" (OuterVolumeSpecName: "var-run") pod "7ba2896e-c9b2-4110-a66a-bf27049b80f0" (UID: "7ba2896e-c9b2-4110-a66a-bf27049b80f0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.392100 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68ca39a4-c529-4e29-b4f0-bea5dde2dab9" (UID: "68ca39a4-c529-4e29-b4f0-bea5dde2dab9"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.392132 4817 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.392175 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7ba2896e-c9b2-4110-a66a-bf27049b80f0" (UID: "7ba2896e-c9b2-4110-a66a-bf27049b80f0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.392268 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5efb39b-4f4c-4cae-a4f6-529877efbafb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5efb39b-4f4c-4cae-a4f6-529877efbafb" (UID: "d5efb39b-4f4c-4cae-a4f6-529877efbafb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.392301 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7ba2896e-c9b2-4110-a66a-bf27049b80f0" (UID: "7ba2896e-c9b2-4110-a66a-bf27049b80f0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.392409 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7ba2896e-c9b2-4110-a66a-bf27049b80f0" (UID: "7ba2896e-c9b2-4110-a66a-bf27049b80f0").
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.392860 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-scripts" (OuterVolumeSpecName: "scripts") pod "7ba2896e-c9b2-4110-a66a-bf27049b80f0" (UID: "7ba2896e-c9b2-4110-a66a-bf27049b80f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.402226 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba2896e-c9b2-4110-a66a-bf27049b80f0-kube-api-access-bgwlw" (OuterVolumeSpecName: "kube-api-access-bgwlw") pod "7ba2896e-c9b2-4110-a66a-bf27049b80f0" (UID: "7ba2896e-c9b2-4110-a66a-bf27049b80f0"). InnerVolumeSpecName "kube-api-access-bgwlw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.405400 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-kube-api-access-x86dw" (OuterVolumeSpecName: "kube-api-access-x86dw") pod "68ca39a4-c529-4e29-b4f0-bea5dde2dab9" (UID: "68ca39a4-c529-4e29-b4f0-bea5dde2dab9"). InnerVolumeSpecName "kube-api-access-x86dw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.422925 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5efb39b-4f4c-4cae-a4f6-529877efbafb-kube-api-access-bb9cb" (OuterVolumeSpecName: "kube-api-access-bb9cb") pod "d5efb39b-4f4c-4cae-a4f6-529877efbafb" (UID: "d5efb39b-4f4c-4cae-a4f6-529877efbafb"). InnerVolumeSpecName "kube-api-access-bb9cb".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.494457 4817 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.494772 4817 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ba2896e-c9b2-4110-a66a-bf27049b80f0-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.494786 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgwlw\" (UniqueName: \"kubernetes.io/projected/7ba2896e-c9b2-4110-a66a-bf27049b80f0-kube-api-access-bgwlw\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.494801 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.494813 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5efb39b-4f4c-4cae-a4f6-529877efbafb-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.494824 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.494835 4817 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba2896e-c9b2-4110-a66a-bf27049b80f0-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.494846
4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb9cb\" (UniqueName: \"kubernetes.io/projected/d5efb39b-4f4c-4cae-a4f6-529877efbafb-kube-api-access-bb9cb\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.494857 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x86dw\" (UniqueName: \"kubernetes.io/projected/68ca39a4-c529-4e29-b4f0-bea5dde2dab9-kube-api-access-x86dw\") on node \"crc\" DevicePath \"\""
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.571127 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3519-account-create-update-4tjwq"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.571126 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3519-account-create-update-4tjwq" event={"ID":"22ca54d4-d910-46fb-9966-792b61e4969b","Type":"ContainerDied","Data":"2145c27d19d184e1791f7c9a1757f47d3a41260348246decc8d7f77538099e7f"}
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.571264 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2145c27d19d184e1791f7c9a1757f47d3a41260348246decc8d7f77538099e7f"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.573514 4817 generic.go:334] "Generic (PLEG): container finished" podID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" containerID="3a7b1cf4852e8739319f69c8a261a36817c331f73c78b92b691e2916854cabb5" exitCode=0
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.573606 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1","Type":"ContainerDied","Data":"3a7b1cf4852e8739319f69c8a261a36817c331f73c78b92b691e2916854cabb5"}
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.577852 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9bcxg-config-ktxwt"
event={"ID":"7ba2896e-c9b2-4110-a66a-bf27049b80f0","Type":"ContainerDied","Data":"5eedceb1ccf0df104ddd55c0579cb62547fe8cef884cc412273ae938422a75f9"}
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.577893 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eedceb1ccf0df104ddd55c0579cb62547fe8cef884cc412273ae938422a75f9"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.577995 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9bcxg-config-ktxwt"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.587274 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bd17-account-create-update-v4mb6" event={"ID":"68ca39a4-c529-4e29-b4f0-bea5dde2dab9","Type":"ContainerDied","Data":"1076a02e7b0c749535a4b2951f2d9ad084d4d50574f3948e336738b527c063df"}
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.587312 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1076a02e7b0c749535a4b2951f2d9ad084d4d50574f3948e336738b527c063df"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.587365 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bd17-account-create-update-v4mb6"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.590923 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6dg5k" event={"ID":"d5efb39b-4f4c-4cae-a4f6-529877efbafb","Type":"ContainerDied","Data":"242fb7d69607fff9a64e955e63a2147c95f81370401563b0d8d97bc28c38f616"}
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.590954 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="242fb7d69607fff9a64e955e63a2147c95f81370401563b0d8d97bc28c38f616"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.593154 4817 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-db-create-6dg5k"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.799802 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.805012 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77de7364-0925-438c-89e2-6ff0d3cb0776-etc-swift\") pod \"swift-storage-0\" (UID: \"77de7364-0925-438c-89e2-6ff0d3cb0776\") " pod="openstack/swift-storage-0"
Feb 18 14:17:07 crc kubenswrapper[4817]: I0218 14:17:07.839659 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 18 14:17:08 crc kubenswrapper[4817]: I0218 14:17:08.425908 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9bcxg-config-ktxwt"]
Feb 18 14:17:08 crc kubenswrapper[4817]: I0218 14:17:08.435411 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9bcxg-config-ktxwt"]
Feb 18 14:17:08 crc kubenswrapper[4817]: I0218 14:17:08.602151 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1","Type":"ContainerStarted","Data":"e26ecc0e0a3b4457bde2bdb563a90b212c385b31af07a4839118b11a69829aaf"}
Feb 18 14:17:08 crc kubenswrapper[4817]: I0218 14:17:08.603125 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 18 14:17:08 crc kubenswrapper[4817]: I0218 14:17:08.632162 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371955.222654 podStartE2EDuration="1m21.632121162s"
podCreationTimestamp="2026-02-18 14:15:47 +0000 UTC" firstStartedPulling="2026-02-18 14:16:04.020901 +0000 UTC m=+1026.596436983" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:08.624566713 +0000 UTC m=+1091.200102706" watchObservedRunningTime="2026-02-18 14:17:08.632121162 +0000 UTC m=+1091.207657145" Feb 18 14:17:08 crc kubenswrapper[4817]: W0218 14:17:08.669123 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77de7364_0925_438c_89e2_6ff0d3cb0776.slice/crio-7db1d927067463087be3c2e08cb2bb54bb6c7cf61640c7d9e2e3149fef240fe9 WatchSource:0}: Error finding container 7db1d927067463087be3c2e08cb2bb54bb6c7cf61640c7d9e2e3149fef240fe9: Status 404 returned error can't find the container with id 7db1d927067463087be3c2e08cb2bb54bb6c7cf61640c7d9e2e3149fef240fe9 Feb 18 14:17:08 crc kubenswrapper[4817]: I0218 14:17:08.681265 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 18 14:17:09 crc kubenswrapper[4817]: I0218 14:17:09.624788 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"7db1d927067463087be3c2e08cb2bb54bb6c7cf61640c7d9e2e3149fef240fe9"} Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.101497 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pbvzk"] Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.110561 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-pbvzk"] Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.184681 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a99eb1-1830-4268-a857-107242bbb333" path="/var/lib/kubelet/pods/20a99eb1-1830-4268-a857-107242bbb333/volumes" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.185380 4817 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ba2896e-c9b2-4110-a66a-bf27049b80f0" path="/var/lib/kubelet/pods/7ba2896e-c9b2-4110-a66a-bf27049b80f0/volumes" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223243 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8fqk4"] Feb 18 14:17:10 crc kubenswrapper[4817]: E0218 14:17:10.223615 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3b6456-c155-43c3-9b35-5832009c7054" containerName="mariadb-database-create" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223631 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3b6456-c155-43c3-9b35-5832009c7054" containerName="mariadb-database-create" Feb 18 14:17:10 crc kubenswrapper[4817]: E0218 14:17:10.223640 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5efb39b-4f4c-4cae-a4f6-529877efbafb" containerName="mariadb-database-create" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223646 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5efb39b-4f4c-4cae-a4f6-529877efbafb" containerName="mariadb-database-create" Feb 18 14:17:10 crc kubenswrapper[4817]: E0218 14:17:10.223654 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eabe27ea-3386-4a4d-bba3-1786b2041d2a" containerName="mariadb-database-create" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223660 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="eabe27ea-3386-4a4d-bba3-1786b2041d2a" containerName="mariadb-database-create" Feb 18 14:17:10 crc kubenswrapper[4817]: E0218 14:17:10.223678 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba2896e-c9b2-4110-a66a-bf27049b80f0" containerName="ovn-config" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223684 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba2896e-c9b2-4110-a66a-bf27049b80f0" containerName="ovn-config" Feb 18 14:17:10 crc 
kubenswrapper[4817]: E0218 14:17:10.223697 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45992562-7a07-46a2-93fb-3c5fc00b367c" containerName="mariadb-account-create-update" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223702 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="45992562-7a07-46a2-93fb-3c5fc00b367c" containerName="mariadb-account-create-update" Feb 18 14:17:10 crc kubenswrapper[4817]: E0218 14:17:10.223712 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68ca39a4-c529-4e29-b4f0-bea5dde2dab9" containerName="mariadb-account-create-update" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223718 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="68ca39a4-c529-4e29-b4f0-bea5dde2dab9" containerName="mariadb-account-create-update" Feb 18 14:17:10 crc kubenswrapper[4817]: E0218 14:17:10.223726 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ca54d4-d910-46fb-9966-792b61e4969b" containerName="mariadb-account-create-update" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223732 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ca54d4-d910-46fb-9966-792b61e4969b" containerName="mariadb-account-create-update" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223874 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="68ca39a4-c529-4e29-b4f0-bea5dde2dab9" containerName="mariadb-account-create-update" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223893 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="45992562-7a07-46a2-93fb-3c5fc00b367c" containerName="mariadb-account-create-update" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223905 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="eabe27ea-3386-4a4d-bba3-1786b2041d2a" containerName="mariadb-database-create" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223911 4817 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="22ca54d4-d910-46fb-9966-792b61e4969b" containerName="mariadb-account-create-update" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223919 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba2896e-c9b2-4110-a66a-bf27049b80f0" containerName="ovn-config" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223927 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3b6456-c155-43c3-9b35-5832009c7054" containerName="mariadb-database-create" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.223938 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5efb39b-4f4c-4cae-a4f6-529877efbafb" containerName="mariadb-database-create" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.224677 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8fqk4" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.228142 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.232838 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8fqk4"] Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.353450 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/206a327c-35a4-4c97-9e11-9792c464b2c3-operator-scripts\") pod \"root-account-create-update-8fqk4\" (UID: \"206a327c-35a4-4c97-9e11-9792c464b2c3\") " pod="openstack/root-account-create-update-8fqk4" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.353671 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps584\" (UniqueName: \"kubernetes.io/projected/206a327c-35a4-4c97-9e11-9792c464b2c3-kube-api-access-ps584\") pod 
\"root-account-create-update-8fqk4\" (UID: \"206a327c-35a4-4c97-9e11-9792c464b2c3\") " pod="openstack/root-account-create-update-8fqk4" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.455911 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps584\" (UniqueName: \"kubernetes.io/projected/206a327c-35a4-4c97-9e11-9792c464b2c3-kube-api-access-ps584\") pod \"root-account-create-update-8fqk4\" (UID: \"206a327c-35a4-4c97-9e11-9792c464b2c3\") " pod="openstack/root-account-create-update-8fqk4" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.455993 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/206a327c-35a4-4c97-9e11-9792c464b2c3-operator-scripts\") pod \"root-account-create-update-8fqk4\" (UID: \"206a327c-35a4-4c97-9e11-9792c464b2c3\") " pod="openstack/root-account-create-update-8fqk4" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.456832 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/206a327c-35a4-4c97-9e11-9792c464b2c3-operator-scripts\") pod \"root-account-create-update-8fqk4\" (UID: \"206a327c-35a4-4c97-9e11-9792c464b2c3\") " pod="openstack/root-account-create-update-8fqk4" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.483623 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps584\" (UniqueName: \"kubernetes.io/projected/206a327c-35a4-4c97-9e11-9792c464b2c3-kube-api-access-ps584\") pod \"root-account-create-update-8fqk4\" (UID: \"206a327c-35a4-4c97-9e11-9792c464b2c3\") " pod="openstack/root-account-create-update-8fqk4" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.558179 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8fqk4" Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.645705 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"b68d9ca26e29f5834a68aa5112c354f40893c10d08e88db0c191c6848f3d8902"} Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.645760 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"d37ffe1be35b9cea1a751d758b12b84825021baaed4275dd98b131da4f774b91"} Feb 18 14:17:10 crc kubenswrapper[4817]: I0218 14:17:10.645773 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"9279f0c70af0c060a610c8756b9959ccf54bf5f45080049b6a86899c171d8491"} Feb 18 14:17:11 crc kubenswrapper[4817]: I0218 14:17:11.047802 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:11 crc kubenswrapper[4817]: I0218 14:17:11.051454 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:11 crc kubenswrapper[4817]: I0218 14:17:11.069532 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8fqk4"] Feb 18 14:17:11 crc kubenswrapper[4817]: I0218 14:17:11.658591 4817 generic.go:334] "Generic (PLEG): container finished" podID="206a327c-35a4-4c97-9e11-9792c464b2c3" containerID="d9cd83558493762888c997dff506122078861fe06345c3b74ea1ad02b2770ab1" exitCode=0 Feb 18 14:17:11 crc kubenswrapper[4817]: I0218 14:17:11.658685 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8fqk4" 
event={"ID":"206a327c-35a4-4c97-9e11-9792c464b2c3","Type":"ContainerDied","Data":"d9cd83558493762888c997dff506122078861fe06345c3b74ea1ad02b2770ab1"} Feb 18 14:17:11 crc kubenswrapper[4817]: I0218 14:17:11.658997 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8fqk4" event={"ID":"206a327c-35a4-4c97-9e11-9792c464b2c3","Type":"ContainerStarted","Data":"2a31da39f0c65ce5ec8e0ab37008f76d6e5726aa418b6d4b6c8da64dd6e849a4"} Feb 18 14:17:11 crc kubenswrapper[4817]: I0218 14:17:11.663112 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"0a38e51b83434be2b044073f1ba29b0853518dfd73c83536d2afeccda59dfd79"} Feb 18 14:17:11 crc kubenswrapper[4817]: I0218 14:17:11.665371 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.338532 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-2fk5c"] Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.340884 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.354533 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.354933 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-24d5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.384626 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2fk5c"] Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.392880 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-db-sync-config-data\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.393315 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7fzc\" (UniqueName: \"kubernetes.io/projected/8de51007-ada2-49f5-90b2-11151899e3cf-kube-api-access-q7fzc\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.393421 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-combined-ca-bundle\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.393571 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-config-data\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.495392 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7fzc\" (UniqueName: \"kubernetes.io/projected/8de51007-ada2-49f5-90b2-11151899e3cf-kube-api-access-q7fzc\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.495860 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-combined-ca-bundle\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.495941 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-config-data\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.496018 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-db-sync-config-data\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.501732 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-db-sync-config-data\") pod \"glance-db-sync-2fk5c\" (UID: 
\"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.502473 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-config-data\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.502972 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-combined-ca-bundle\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.522594 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7fzc\" (UniqueName: \"kubernetes.io/projected/8de51007-ada2-49f5-90b2-11151899e3cf-kube-api-access-q7fzc\") pod \"glance-db-sync-2fk5c\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.682009 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2fk5c" Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.687725 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"01f50101260f856ea3dfc184f4e8f44667b439b2fc43f2a1b316949787854725"} Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.687770 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"a19ba8009973caf05bb6392085c7a4ae54cfc40a0c12055c72aa15217b1fd366"} Feb 18 14:17:12 crc kubenswrapper[4817]: I0218 14:17:12.687789 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"3fdd4684f3acc24b16dfce6ed8e554c3e51dd2884e5136cff3789544e1509ebb"} Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.082959 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8fqk4" Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.106916 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/206a327c-35a4-4c97-9e11-9792c464b2c3-operator-scripts\") pod \"206a327c-35a4-4c97-9e11-9792c464b2c3\" (UID: \"206a327c-35a4-4c97-9e11-9792c464b2c3\") " Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.107265 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps584\" (UniqueName: \"kubernetes.io/projected/206a327c-35a4-4c97-9e11-9792c464b2c3-kube-api-access-ps584\") pod \"206a327c-35a4-4c97-9e11-9792c464b2c3\" (UID: \"206a327c-35a4-4c97-9e11-9792c464b2c3\") " Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.107840 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/206a327c-35a4-4c97-9e11-9792c464b2c3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "206a327c-35a4-4c97-9e11-9792c464b2c3" (UID: "206a327c-35a4-4c97-9e11-9792c464b2c3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.108220 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/206a327c-35a4-4c97-9e11-9792c464b2c3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.114796 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/206a327c-35a4-4c97-9e11-9792c464b2c3-kube-api-access-ps584" (OuterVolumeSpecName: "kube-api-access-ps584") pod "206a327c-35a4-4c97-9e11-9792c464b2c3" (UID: "206a327c-35a4-4c97-9e11-9792c464b2c3"). InnerVolumeSpecName "kube-api-access-ps584". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.210903 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps584\" (UniqueName: \"kubernetes.io/projected/206a327c-35a4-4c97-9e11-9792c464b2c3-kube-api-access-ps584\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.402419 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2fk5c"] Feb 18 14:17:13 crc kubenswrapper[4817]: W0218 14:17:13.430290 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8de51007_ada2_49f5_90b2_11151899e3cf.slice/crio-bb641bbbdcbc85305cb1c82ecfc41c8af46270b50aa7e87aba5ae233f0c82a8b WatchSource:0}: Error finding container bb641bbbdcbc85305cb1c82ecfc41c8af46270b50aa7e87aba5ae233f0c82a8b: Status 404 returned error can't find the container with id bb641bbbdcbc85305cb1c82ecfc41c8af46270b50aa7e87aba5ae233f0c82a8b Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.698024 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"23234fd25c9a651da3d698a6cf5689b4c8e0ac366ec3f9064934eaf25ae8d468"} Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.700864 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8fqk4" Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.700855 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8fqk4" event={"ID":"206a327c-35a4-4c97-9e11-9792c464b2c3","Type":"ContainerDied","Data":"2a31da39f0c65ce5ec8e0ab37008f76d6e5726aa418b6d4b6c8da64dd6e849a4"} Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.700914 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a31da39f0c65ce5ec8e0ab37008f76d6e5726aa418b6d4b6c8da64dd6e849a4" Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.702509 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fk5c" event={"ID":"8de51007-ada2-49f5-90b2-11151899e3cf","Type":"ContainerStarted","Data":"bb641bbbdcbc85305cb1c82ecfc41c8af46270b50aa7e87aba5ae233f0c82a8b"} Feb 18 14:17:13 crc kubenswrapper[4817]: I0218 14:17:13.996826 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 18 14:17:14 crc kubenswrapper[4817]: I0218 14:17:14.717933 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"69a8139f37fa5d56db3c68a2816d22533cd75022a027c4c11e757c44bfd2c107"} Feb 18 14:17:14 crc kubenswrapper[4817]: I0218 14:17:14.718324 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"fb813c76ee98dd086f9184c40ea180e05e41f438c5b6bf3ff52d8fd8648323d1"} Feb 18 14:17:14 crc kubenswrapper[4817]: I0218 14:17:14.937407 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 14:17:14 crc kubenswrapper[4817]: I0218 14:17:14.938132 4817 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/prometheus-metric-storage-0" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="prometheus" containerID="cri-o://1e8977c90e480a38324dbe7a779273e2890fcc959446890b828d6f54e033a6a2" gracePeriod=600 Feb 18 14:17:14 crc kubenswrapper[4817]: I0218 14:17:14.938535 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="config-reloader" containerID="cri-o://a115b4ffa1e08b7aa72751645a637afa2ca40952df9066d390305a579effd26c" gracePeriod=600 Feb 18 14:17:14 crc kubenswrapper[4817]: I0218 14:17:14.938809 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="thanos-sidecar" containerID="cri-o://d497a7cb528a81ed2229b0cdac90677ed3c2d2f20b82859c9abe6a0511ea7ee8" gracePeriod=600 Feb 18 14:17:15 crc kubenswrapper[4817]: I0218 14:17:15.756607 4817 generic.go:334] "Generic (PLEG): container finished" podID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerID="d497a7cb528a81ed2229b0cdac90677ed3c2d2f20b82859c9abe6a0511ea7ee8" exitCode=0 Feb 18 14:17:15 crc kubenswrapper[4817]: I0218 14:17:15.756943 4817 generic.go:334] "Generic (PLEG): container finished" podID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerID="a115b4ffa1e08b7aa72751645a637afa2ca40952df9066d390305a579effd26c" exitCode=0 Feb 18 14:17:15 crc kubenswrapper[4817]: I0218 14:17:15.756956 4817 generic.go:334] "Generic (PLEG): container finished" podID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerID="1e8977c90e480a38324dbe7a779273e2890fcc959446890b828d6f54e033a6a2" exitCode=0 Feb 18 14:17:15 crc kubenswrapper[4817]: I0218 14:17:15.756757 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"d5d448f4-839e-4b71-ac6e-0c941ccd5a14","Type":"ContainerDied","Data":"d497a7cb528a81ed2229b0cdac90677ed3c2d2f20b82859c9abe6a0511ea7ee8"} Feb 18 14:17:15 crc kubenswrapper[4817]: I0218 14:17:15.757060 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d5d448f4-839e-4b71-ac6e-0c941ccd5a14","Type":"ContainerDied","Data":"a115b4ffa1e08b7aa72751645a637afa2ca40952df9066d390305a579effd26c"} Feb 18 14:17:15 crc kubenswrapper[4817]: I0218 14:17:15.757078 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d5d448f4-839e-4b71-ac6e-0c941ccd5a14","Type":"ContainerDied","Data":"1e8977c90e480a38324dbe7a779273e2890fcc959446890b828d6f54e033a6a2"} Feb 18 14:17:15 crc kubenswrapper[4817]: I0218 14:17:15.766786 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"35fa4a565f592534c4a1ba555cddf793dd52f258067f3bb20cb11c6ad678209a"} Feb 18 14:17:15 crc kubenswrapper[4817]: I0218 14:17:15.766839 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"959512cac22211543e274738e63180050b86d6487ce4af14c12499e767bfd918"} Feb 18 14:17:15 crc kubenswrapper[4817]: I0218 14:17:15.766850 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"e6580c713018d1d497b1ab9476dbb279bc198fce89f19492bb618a5377be109b"} Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.059904 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.176710 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-web-config\") pod \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.176796 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-2\") pod \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.176846 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config-out\") pod \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.176946 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-1\") pod \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.177011 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config\") pod \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.177042 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-tls-assets\") pod \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.177213 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") pod \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.177260 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t56w9\" (UniqueName: \"kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-kube-api-access-t56w9\") pod \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.177303 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-0\") pod \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.177327 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-thanos-prometheus-http-client-file\") pod \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\" (UID: \"d5d448f4-839e-4b71-ac6e-0c941ccd5a14\") " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.179476 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "d5d448f4-839e-4b71-ac6e-0c941ccd5a14" (UID: "d5d448f4-839e-4b71-ac6e-0c941ccd5a14"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.179860 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "d5d448f4-839e-4b71-ac6e-0c941ccd5a14" (UID: "d5d448f4-839e-4b71-ac6e-0c941ccd5a14"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.180267 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "d5d448f4-839e-4b71-ac6e-0c941ccd5a14" (UID: "d5d448f4-839e-4b71-ac6e-0c941ccd5a14"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.186662 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d5d448f4-839e-4b71-ac6e-0c941ccd5a14" (UID: "d5d448f4-839e-4b71-ac6e-0c941ccd5a14"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.187059 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-kube-api-access-t56w9" (OuterVolumeSpecName: "kube-api-access-t56w9") pod "d5d448f4-839e-4b71-ac6e-0c941ccd5a14" (UID: "d5d448f4-839e-4b71-ac6e-0c941ccd5a14"). InnerVolumeSpecName "kube-api-access-t56w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.187174 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d5d448f4-839e-4b71-ac6e-0c941ccd5a14" (UID: "d5d448f4-839e-4b71-ac6e-0c941ccd5a14"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.190713 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config" (OuterVolumeSpecName: "config") pod "d5d448f4-839e-4b71-ac6e-0c941ccd5a14" (UID: "d5d448f4-839e-4b71-ac6e-0c941ccd5a14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.192170 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config-out" (OuterVolumeSpecName: "config-out") pod "d5d448f4-839e-4b71-ac6e-0c941ccd5a14" (UID: "d5d448f4-839e-4b71-ac6e-0c941ccd5a14"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.246955 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "d5d448f4-839e-4b71-ac6e-0c941ccd5a14" (UID: "d5d448f4-839e-4b71-ac6e-0c941ccd5a14"). InnerVolumeSpecName "pvc-04929794-6ef0-4dce-978a-755fd164a7e0". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.279387 4817 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.279425 4817 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config-out\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.279439 4817 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.279452 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.279465 4817 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 
14:17:16.279572 4817 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-04929794-6ef0-4dce-978a-755fd164a7e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") on node \"crc\" " Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.279595 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t56w9\" (UniqueName: \"kubernetes.io/projected/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-kube-api-access-t56w9\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.279607 4817 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.279619 4817 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.280063 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-web-config" (OuterVolumeSpecName: "web-config") pod "d5d448f4-839e-4b71-ac6e-0c941ccd5a14" (UID: "d5d448f4-839e-4b71-ac6e-0c941ccd5a14"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.310464 4817 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.310614 4817 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-04929794-6ef0-4dce-978a-755fd164a7e0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0") on node "crc" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.381086 4817 reconciler_common.go:293] "Volume detached for volume \"pvc-04929794-6ef0-4dce-978a-755fd164a7e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.381441 4817 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d5d448f4-839e-4b71-ac6e-0c941ccd5a14-web-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.392587 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="7f685dd5-8921-4e4a-a4d5-d19a499775f5" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.784032 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"4bddf3482f782283610db41e9c7db1b5fd19cb1ac59a843b2c9abffc8c22254d"} Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.784086 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"77de7364-0925-438c-89e2-6ff0d3cb0776","Type":"ContainerStarted","Data":"f9e2b84d8650ae14b1816e7ce1740329e6a1bffb7e777cabe69691951e552a64"} Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.789115 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"d5d448f4-839e-4b71-ac6e-0c941ccd5a14","Type":"ContainerDied","Data":"ac599d57fa780b172192a3ed1aedd48964d320e0159a4d96cfa2f1eddda81da0"} Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.789183 4817 scope.go:117] "RemoveContainer" containerID="d497a7cb528a81ed2229b0cdac90677ed3c2d2f20b82859c9abe6a0511ea7ee8" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.789327 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.820101 4817 scope.go:117] "RemoveContainer" containerID="a115b4ffa1e08b7aa72751645a637afa2ca40952df9066d390305a579effd26c" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.836529 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.456685149 podStartE2EDuration="42.836506832s" podCreationTimestamp="2026-02-18 14:16:34 +0000 UTC" firstStartedPulling="2026-02-18 14:17:08.676552846 +0000 UTC m=+1091.252088829" lastFinishedPulling="2026-02-18 14:17:14.056374529 +0000 UTC m=+1096.631910512" observedRunningTime="2026-02-18 14:17:16.832570884 +0000 UTC m=+1099.408106877" watchObservedRunningTime="2026-02-18 14:17:16.836506832 +0000 UTC m=+1099.412042815" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.859673 4817 scope.go:117] "RemoveContainer" containerID="1e8977c90e480a38324dbe7a779273e2890fcc959446890b828d6f54e033a6a2" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.862044 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.879685 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.892418 4817 scope.go:117] "RemoveContainer" containerID="c7916262128ab97149b2ff1a8ebd8ab2eaf07d5003c9118d7e36bb46bc6a4812" Feb 
18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.900749 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 14:17:16 crc kubenswrapper[4817]: E0218 14:17:16.901203 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="prometheus" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.901222 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="prometheus" Feb 18 14:17:16 crc kubenswrapper[4817]: E0218 14:17:16.901252 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="config-reloader" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.901260 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="config-reloader" Feb 18 14:17:16 crc kubenswrapper[4817]: E0218 14:17:16.901283 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206a327c-35a4-4c97-9e11-9792c464b2c3" containerName="mariadb-account-create-update" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.901292 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="206a327c-35a4-4c97-9e11-9792c464b2c3" containerName="mariadb-account-create-update" Feb 18 14:17:16 crc kubenswrapper[4817]: E0218 14:17:16.901306 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="thanos-sidecar" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.901312 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="thanos-sidecar" Feb 18 14:17:16 crc kubenswrapper[4817]: E0218 14:17:16.901320 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="init-config-reloader" Feb 18 14:17:16 crc 
kubenswrapper[4817]: I0218 14:17:16.901327 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="init-config-reloader" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.901524 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="206a327c-35a4-4c97-9e11-9792c464b2c3" containerName="mariadb-account-create-update" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.901560 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="config-reloader" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.901588 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="prometheus" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.901599 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="thanos-sidecar" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.906718 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.910804 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.911466 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.911675 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.911883 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.912123 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.913267 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.913444 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.913449 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.914850 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-287x5" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.932425 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996004 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996341 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996373 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996458 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996516 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56f5l\" (UniqueName: \"kubernetes.io/projected/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-kube-api-access-56f5l\") pod 
\"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996599 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996663 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996800 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-04929794-6ef0-4dce-978a-755fd164a7e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996839 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-config\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996877 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996917 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.996954 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:16 crc kubenswrapper[4817]: I0218 14:17:16.997043 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099159 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " 
pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099221 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56f5l\" (UniqueName: \"kubernetes.io/projected/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-kube-api-access-56f5l\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099256 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099279 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099331 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-config\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099357 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-04929794-6ef0-4dce-978a-755fd164a7e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") 
" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099393 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099422 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099457 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099477 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099507 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099584 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.099602 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.100313 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.101919 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 
14:17:17.106927 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.106971 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.107036 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-04929794-6ef0-4dce-978a-755fd164a7e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ab4d68949bad9bb24db26ca4380e123968c4e63f4e0711630b3924ef3b41508c/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.107321 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-config\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.108252 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.118377 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.118889 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.123405 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.123610 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.124632 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.124968 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.127365 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56f5l\" (UniqueName: \"kubernetes.io/projected/fbb28d6a-260d-45fa-80ec-9f583e8fc37b-kube-api-access-56f5l\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.160342 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-04929794-6ef0-4dce-978a-755fd164a7e0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-04929794-6ef0-4dce-978a-755fd164a7e0\") pod \"prometheus-metric-storage-0\" (UID: \"fbb28d6a-260d-45fa-80ec-9f583e8fc37b\") " pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.180336 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-b795s"]
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.185367 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.189310 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.210880 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-b795s"]
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.212567 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.213835 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-config\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.214889 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnc6r\" (UniqueName: \"kubernetes.io/projected/eb3fc91e-a9df-429a-b494-fbac21db2ab9-kube-api-access-jnc6r\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.215012 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.215439 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.215511 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.266304 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.317058 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.317120 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.317158 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.317185 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-config\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.317223 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnc6r\" (UniqueName: \"kubernetes.io/projected/eb3fc91e-a9df-429a-b494-fbac21db2ab9-kube-api-access-jnc6r\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.317261 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.318203 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.318429 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.318443 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-config\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.318686 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.318994 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.338093 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnc6r\" (UniqueName: \"kubernetes.io/projected/eb3fc91e-a9df-429a-b494-fbac21db2ab9-kube-api-access-jnc6r\") pod \"dnsmasq-dns-77585f5f8c-b795s\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") " pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.534589 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.757364 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 18 14:17:17 crc kubenswrapper[4817]: I0218 14:17:17.802875 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fbb28d6a-260d-45fa-80ec-9f583e8fc37b","Type":"ContainerStarted","Data":"5e0c334901ef648669fb5721452658e0d630bfa8fe1963ce9def95784fb16136"}
Feb 18 14:17:18 crc kubenswrapper[4817]: I0218 14:17:18.034616 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-b795s"]
Feb 18 14:17:18 crc kubenswrapper[4817]: W0218 14:17:18.042816 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb3fc91e_a9df_429a_b494_fbac21db2ab9.slice/crio-ede924ac3267edd61ebd005671c9730bba813cafcadcc33a6786d93a813e2140 WatchSource:0}: Error finding container ede924ac3267edd61ebd005671c9730bba813cafcadcc33a6786d93a813e2140: Status 404 returned error can't find the container with id ede924ac3267edd61ebd005671c9730bba813cafcadcc33a6786d93a813e2140
Feb 18 14:17:18 crc kubenswrapper[4817]: I0218 14:17:18.186839 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" path="/var/lib/kubelet/pods/d5d448f4-839e-4b71-ac6e-0c941ccd5a14/volumes"
Feb 18 14:17:18 crc kubenswrapper[4817]: I0218 14:17:18.585358 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 18 14:17:18 crc kubenswrapper[4817]: I0218 14:17:18.829175 4817 generic.go:334] "Generic (PLEG): container finished" podID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerID="e38ac216656ee1ec7984ec44b6b0624827bed0e3ba8fdb0f3b57817dde46e0d9" exitCode=0
Feb 18 14:17:18 crc kubenswrapper[4817]: I0218 14:17:18.829224 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" event={"ID":"eb3fc91e-a9df-429a-b494-fbac21db2ab9","Type":"ContainerDied","Data":"e38ac216656ee1ec7984ec44b6b0624827bed0e3ba8fdb0f3b57817dde46e0d9"}
Feb 18 14:17:18 crc kubenswrapper[4817]: I0218 14:17:18.829256 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" event={"ID":"eb3fc91e-a9df-429a-b494-fbac21db2ab9","Type":"ContainerStarted","Data":"ede924ac3267edd61ebd005671c9730bba813cafcadcc33a6786d93a813e2140"}
Feb 18 14:17:18 crc kubenswrapper[4817]: I0218 14:17:18.925278 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.033966 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-4q5qq"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.035504 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4q5qq"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.048694 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="d5d448f4-839e-4b71-ac6e-0c941ccd5a14" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.114:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.057446 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4q5qq"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.151338 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5d24-account-create-update-rrt2r"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.154278 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5d24-account-create-update-rrt2r"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.157323 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-operator-scripts\") pod \"cinder-db-create-4q5qq\" (UID: \"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9\") " pod="openstack/cinder-db-create-4q5qq"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.157452 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlrrs\" (UniqueName: \"kubernetes.io/projected/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-kube-api-access-rlrrs\") pod \"cinder-db-create-4q5qq\" (UID: \"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9\") " pod="openstack/cinder-db-create-4q5qq"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.160572 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.178946 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5d24-account-create-update-rrt2r"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.243044 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-hs8dt"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.244612 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-hs8dt"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.275473 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-operator-scripts\") pod \"cinder-db-create-4q5qq\" (UID: \"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9\") " pod="openstack/cinder-db-create-4q5qq"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.275650 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288ec94a-2fa7-44a5-afaf-1bf7909336a7-operator-scripts\") pod \"cinder-5d24-account-create-update-rrt2r\" (UID: \"288ec94a-2fa7-44a5-afaf-1bf7909336a7\") " pod="openstack/cinder-5d24-account-create-update-rrt2r"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.275759 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlrrs\" (UniqueName: \"kubernetes.io/projected/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-kube-api-access-rlrrs\") pod \"cinder-db-create-4q5qq\" (UID: \"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9\") " pod="openstack/cinder-db-create-4q5qq"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.277615 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-operator-scripts\") pod \"cinder-db-create-4q5qq\" (UID: \"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9\") " pod="openstack/cinder-db-create-4q5qq"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.278226 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4tp9\" (UniqueName: \"kubernetes.io/projected/288ec94a-2fa7-44a5-afaf-1bf7909336a7-kube-api-access-f4tp9\") pod \"cinder-5d24-account-create-update-rrt2r\" (UID: \"288ec94a-2fa7-44a5-afaf-1bf7909336a7\") " pod="openstack/cinder-5d24-account-create-update-rrt2r"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.297255 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-hs8dt"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.311694 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlrrs\" (UniqueName: \"kubernetes.io/projected/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-kube-api-access-rlrrs\") pod \"cinder-db-create-4q5qq\" (UID: \"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9\") " pod="openstack/cinder-db-create-4q5qq"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.365528 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4q5qq"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.389116 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4tp9\" (UniqueName: \"kubernetes.io/projected/288ec94a-2fa7-44a5-afaf-1bf7909336a7-kube-api-access-f4tp9\") pod \"cinder-5d24-account-create-update-rrt2r\" (UID: \"288ec94a-2fa7-44a5-afaf-1bf7909336a7\") " pod="openstack/cinder-5d24-account-create-update-rrt2r"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.389205 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff3fb952-c1b3-4311-832b-c8807407385e-operator-scripts\") pod \"cloudkitty-db-create-hs8dt\" (UID: \"ff3fb952-c1b3-4311-832b-c8807407385e\") " pod="openstack/cloudkitty-db-create-hs8dt"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.389248 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6qfg\" (UniqueName: \"kubernetes.io/projected/ff3fb952-c1b3-4311-832b-c8807407385e-kube-api-access-l6qfg\") pod \"cloudkitty-db-create-hs8dt\" (UID: \"ff3fb952-c1b3-4311-832b-c8807407385e\") " pod="openstack/cloudkitty-db-create-hs8dt"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.389325 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288ec94a-2fa7-44a5-afaf-1bf7909336a7-operator-scripts\") pod \"cinder-5d24-account-create-update-rrt2r\" (UID: \"288ec94a-2fa7-44a5-afaf-1bf7909336a7\") " pod="openstack/cinder-5d24-account-create-update-rrt2r"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.390694 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288ec94a-2fa7-44a5-afaf-1bf7909336a7-operator-scripts\") pod \"cinder-5d24-account-create-update-rrt2r\" (UID: \"288ec94a-2fa7-44a5-afaf-1bf7909336a7\") " pod="openstack/cinder-5d24-account-create-update-rrt2r"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.393489 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jwsrh"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.395765 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.402909 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.403150 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2ntmd"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.403287 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.403440 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.422034 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jwsrh"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.464537 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-553d-account-create-update-twtnj"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.465807 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-553d-account-create-update-twtnj"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.469038 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.491277 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-combined-ca-bundle\") pod \"keystone-db-sync-jwsrh\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.491358 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjlsf\" (UniqueName: \"kubernetes.io/projected/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-kube-api-access-jjlsf\") pod \"keystone-db-sync-jwsrh\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.491394 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqvps\" (UniqueName: \"kubernetes.io/projected/4135b831-d02a-45bc-aea0-4584e8b2a01f-kube-api-access-pqvps\") pod \"neutron-553d-account-create-update-twtnj\" (UID: \"4135b831-d02a-45bc-aea0-4584e8b2a01f\") " pod="openstack/neutron-553d-account-create-update-twtnj"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.491508 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff3fb952-c1b3-4311-832b-c8807407385e-operator-scripts\") pod \"cloudkitty-db-create-hs8dt\" (UID: \"ff3fb952-c1b3-4311-832b-c8807407385e\") " pod="openstack/cloudkitty-db-create-hs8dt"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.491539 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4135b831-d02a-45bc-aea0-4584e8b2a01f-operator-scripts\") pod \"neutron-553d-account-create-update-twtnj\" (UID: \"4135b831-d02a-45bc-aea0-4584e8b2a01f\") " pod="openstack/neutron-553d-account-create-update-twtnj"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.491574 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-config-data\") pod \"keystone-db-sync-jwsrh\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.491599 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6qfg\" (UniqueName: \"kubernetes.io/projected/ff3fb952-c1b3-4311-832b-c8807407385e-kube-api-access-l6qfg\") pod \"cloudkitty-db-create-hs8dt\" (UID: \"ff3fb952-c1b3-4311-832b-c8807407385e\") " pod="openstack/cloudkitty-db-create-hs8dt"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.492958 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff3fb952-c1b3-4311-832b-c8807407385e-operator-scripts\") pod \"cloudkitty-db-create-hs8dt\" (UID: \"ff3fb952-c1b3-4311-832b-c8807407385e\") " pod="openstack/cloudkitty-db-create-hs8dt"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.498716 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bkwzx"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.502957 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bkwzx"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.516458 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-553d-account-create-update-twtnj"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.554853 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6qfg\" (UniqueName: \"kubernetes.io/projected/ff3fb952-c1b3-4311-832b-c8807407385e-kube-api-access-l6qfg\") pod \"cloudkitty-db-create-hs8dt\" (UID: \"ff3fb952-c1b3-4311-832b-c8807407385e\") " pod="openstack/cloudkitty-db-create-hs8dt"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.556476 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4tp9\" (UniqueName: \"kubernetes.io/projected/288ec94a-2fa7-44a5-afaf-1bf7909336a7-kube-api-access-f4tp9\") pod \"cinder-5d24-account-create-update-rrt2r\" (UID: \"288ec94a-2fa7-44a5-afaf-1bf7909336a7\") " pod="openstack/cinder-5d24-account-create-update-rrt2r"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.556526 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bkwzx"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.575042 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zqdhk"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.576330 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zqdhk"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.584555 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db0f-account-create-update-ljxz9"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.585969 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db0f-account-create-update-ljxz9"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.588447 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.593638 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4135b831-d02a-45bc-aea0-4584e8b2a01f-operator-scripts\") pod \"neutron-553d-account-create-update-twtnj\" (UID: \"4135b831-d02a-45bc-aea0-4584e8b2a01f\") " pod="openstack/neutron-553d-account-create-update-twtnj"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.593696 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-config-data\") pod \"keystone-db-sync-jwsrh\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.593785 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-combined-ca-bundle\") pod \"keystone-db-sync-jwsrh\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.593841 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjlsf\" (UniqueName: \"kubernetes.io/projected/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-kube-api-access-jjlsf\") pod \"keystone-db-sync-jwsrh\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.593878 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqvps\" (UniqueName: \"kubernetes.io/projected/4135b831-d02a-45bc-aea0-4584e8b2a01f-kube-api-access-pqvps\") pod \"neutron-553d-account-create-update-twtnj\" (UID: \"4135b831-d02a-45bc-aea0-4584e8b2a01f\") " pod="openstack/neutron-553d-account-create-update-twtnj"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.599686 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4135b831-d02a-45bc-aea0-4584e8b2a01f-operator-scripts\") pod \"neutron-553d-account-create-update-twtnj\" (UID: \"4135b831-d02a-45bc-aea0-4584e8b2a01f\") " pod="openstack/neutron-553d-account-create-update-twtnj"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.603544 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-combined-ca-bundle\") pod \"keystone-db-sync-jwsrh\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.604127 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-config-data\") pod \"keystone-db-sync-jwsrh\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.605831 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-hs8dt"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.607591 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zqdhk"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.617010 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqvps\" (UniqueName: \"kubernetes.io/projected/4135b831-d02a-45bc-aea0-4584e8b2a01f-kube-api-access-pqvps\") pod \"neutron-553d-account-create-update-twtnj\" (UID: \"4135b831-d02a-45bc-aea0-4584e8b2a01f\") " pod="openstack/neutron-553d-account-create-update-twtnj"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.617091 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db0f-account-create-update-ljxz9"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.627805 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjlsf\" (UniqueName: \"kubernetes.io/projected/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-kube-api-access-jjlsf\") pod \"keystone-db-sync-jwsrh\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.695672 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzzvd\" (UniqueName: \"kubernetes.io/projected/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-kube-api-access-wzzvd\") pod \"cloudkitty-db0f-account-create-update-ljxz9\" (UID: \"ed148ce2-1bf9-44a7-b0bd-444c12bead6c\") " pod="openstack/cloudkitty-db0f-account-create-update-ljxz9"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.695751 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d4b2205-ea38-4a29-858f-c2acb3cbd423-operator-scripts\") pod \"barbican-db-create-bkwzx\" (UID: \"2d4b2205-ea38-4a29-858f-c2acb3cbd423\") " pod="openstack/barbican-db-create-bkwzx"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.695780 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr4sr\" (UniqueName: \"kubernetes.io/projected/2d4b2205-ea38-4a29-858f-c2acb3cbd423-kube-api-access-mr4sr\") pod \"barbican-db-create-bkwzx\" (UID: \"2d4b2205-ea38-4a29-858f-c2acb3cbd423\") " pod="openstack/barbican-db-create-bkwzx"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.695817 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-operator-scripts\") pod \"cloudkitty-db0f-account-create-update-ljxz9\" (UID: \"ed148ce2-1bf9-44a7-b0bd-444c12bead6c\") " pod="openstack/cloudkitty-db0f-account-create-update-ljxz9"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.695882 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67e72216-792c-4525-b231-2370a5b4d8ef-operator-scripts\") pod \"neutron-db-create-zqdhk\" (UID: \"67e72216-792c-4525-b231-2370a5b4d8ef\") " pod="openstack/neutron-db-create-zqdhk"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.695949 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gtdb\" (UniqueName: \"kubernetes.io/projected/67e72216-792c-4525-b231-2370a5b4d8ef-kube-api-access-6gtdb\") pod \"neutron-db-create-zqdhk\" (UID: \"67e72216-792c-4525-b231-2370a5b4d8ef\") " pod="openstack/neutron-db-create-zqdhk"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.726873 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9002-account-create-update-x52d2"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.728469 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9002-account-create-update-x52d2"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.730517 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.742755 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jwsrh"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.748247 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9002-account-create-update-x52d2"]
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.804876 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5d24-account-create-update-rrt2r"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.805805 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xhsq\" (UniqueName: \"kubernetes.io/projected/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-kube-api-access-9xhsq\") pod \"barbican-9002-account-create-update-x52d2\" (UID: \"e4e9ee1a-8b93-4306-ad94-d154b80f60c3\") " pod="openstack/barbican-9002-account-create-update-x52d2"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.805864 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gtdb\" (UniqueName: \"kubernetes.io/projected/67e72216-792c-4525-b231-2370a5b4d8ef-kube-api-access-6gtdb\") pod \"neutron-db-create-zqdhk\" (UID: \"67e72216-792c-4525-b231-2370a5b4d8ef\") " pod="openstack/neutron-db-create-zqdhk"
Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.805913 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzzvd\" (UniqueName: \"kubernetes.io/projected/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-kube-api-access-wzzvd\") pod
\"cloudkitty-db0f-account-create-update-ljxz9\" (UID: \"ed148ce2-1bf9-44a7-b0bd-444c12bead6c\") " pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.806001 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d4b2205-ea38-4a29-858f-c2acb3cbd423-operator-scripts\") pod \"barbican-db-create-bkwzx\" (UID: \"2d4b2205-ea38-4a29-858f-c2acb3cbd423\") " pod="openstack/barbican-db-create-bkwzx" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.806024 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-operator-scripts\") pod \"barbican-9002-account-create-update-x52d2\" (UID: \"e4e9ee1a-8b93-4306-ad94-d154b80f60c3\") " pod="openstack/barbican-9002-account-create-update-x52d2" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.806056 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr4sr\" (UniqueName: \"kubernetes.io/projected/2d4b2205-ea38-4a29-858f-c2acb3cbd423-kube-api-access-mr4sr\") pod \"barbican-db-create-bkwzx\" (UID: \"2d4b2205-ea38-4a29-858f-c2acb3cbd423\") " pod="openstack/barbican-db-create-bkwzx" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.806093 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-operator-scripts\") pod \"cloudkitty-db0f-account-create-update-ljxz9\" (UID: \"ed148ce2-1bf9-44a7-b0bd-444c12bead6c\") " pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.806171 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/67e72216-792c-4525-b231-2370a5b4d8ef-operator-scripts\") pod \"neutron-db-create-zqdhk\" (UID: \"67e72216-792c-4525-b231-2370a5b4d8ef\") " pod="openstack/neutron-db-create-zqdhk" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.807303 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67e72216-792c-4525-b231-2370a5b4d8ef-operator-scripts\") pod \"neutron-db-create-zqdhk\" (UID: \"67e72216-792c-4525-b231-2370a5b4d8ef\") " pod="openstack/neutron-db-create-zqdhk" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.808409 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d4b2205-ea38-4a29-858f-c2acb3cbd423-operator-scripts\") pod \"barbican-db-create-bkwzx\" (UID: \"2d4b2205-ea38-4a29-858f-c2acb3cbd423\") " pod="openstack/barbican-db-create-bkwzx" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.808721 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-553d-account-create-update-twtnj" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.821683 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-operator-scripts\") pod \"cloudkitty-db0f-account-create-update-ljxz9\" (UID: \"ed148ce2-1bf9-44a7-b0bd-444c12bead6c\") " pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.829556 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gtdb\" (UniqueName: \"kubernetes.io/projected/67e72216-792c-4525-b231-2370a5b4d8ef-kube-api-access-6gtdb\") pod \"neutron-db-create-zqdhk\" (UID: \"67e72216-792c-4525-b231-2370a5b4d8ef\") " pod="openstack/neutron-db-create-zqdhk" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.835645 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr4sr\" (UniqueName: \"kubernetes.io/projected/2d4b2205-ea38-4a29-858f-c2acb3cbd423-kube-api-access-mr4sr\") pod \"barbican-db-create-bkwzx\" (UID: \"2d4b2205-ea38-4a29-858f-c2acb3cbd423\") " pod="openstack/barbican-db-create-bkwzx" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.837016 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzzvd\" (UniqueName: \"kubernetes.io/projected/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-kube-api-access-wzzvd\") pod \"cloudkitty-db0f-account-create-update-ljxz9\" (UID: \"ed148ce2-1bf9-44a7-b0bd-444c12bead6c\") " pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.878483 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" 
event={"ID":"eb3fc91e-a9df-429a-b494-fbac21db2ab9","Type":"ContainerStarted","Data":"65c0df07b7ef056ed33a7df783205f0d6e71f08915af3ed65418d22a3a7a4a15"} Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.880183 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.908776 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xhsq\" (UniqueName: \"kubernetes.io/projected/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-kube-api-access-9xhsq\") pod \"barbican-9002-account-create-update-x52d2\" (UID: \"e4e9ee1a-8b93-4306-ad94-d154b80f60c3\") " pod="openstack/barbican-9002-account-create-update-x52d2" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.908950 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-operator-scripts\") pod \"barbican-9002-account-create-update-x52d2\" (UID: \"e4e9ee1a-8b93-4306-ad94-d154b80f60c3\") " pod="openstack/barbican-9002-account-create-update-x52d2" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.910498 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-operator-scripts\") pod \"barbican-9002-account-create-update-x52d2\" (UID: \"e4e9ee1a-8b93-4306-ad94-d154b80f60c3\") " pod="openstack/barbican-9002-account-create-update-x52d2" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.913018 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" podStartSLOduration=2.912972207 podStartE2EDuration="2.912972207s" podCreationTimestamp="2026-02-18 14:17:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-18 14:17:19.900476673 +0000 UTC m=+1102.476012656" watchObservedRunningTime="2026-02-18 14:17:19.912972207 +0000 UTC m=+1102.488508190" Feb 18 14:17:19 crc kubenswrapper[4817]: I0218 14:17:19.972263 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xhsq\" (UniqueName: \"kubernetes.io/projected/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-kube-api-access-9xhsq\") pod \"barbican-9002-account-create-update-x52d2\" (UID: \"e4e9ee1a-8b93-4306-ad94-d154b80f60c3\") " pod="openstack/barbican-9002-account-create-update-x52d2" Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.093076 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zqdhk" Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.116465 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.129902 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bkwzx" Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.138737 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9002-account-create-update-x52d2" Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.282102 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4q5qq"] Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.523262 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5d24-account-create-update-rrt2r"] Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.592357 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-hs8dt"] Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.627791 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-553d-account-create-update-twtnj"] Feb 18 14:17:20 crc kubenswrapper[4817]: W0218 14:17:20.660111 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff3fb952_c1b3_4311_832b_c8807407385e.slice/crio-5a971bc7e969d1a599d1e832cd360ecb032c66a718450f082528ac3e38f76911 WatchSource:0}: Error finding container 5a971bc7e969d1a599d1e832cd360ecb032c66a718450f082528ac3e38f76911: Status 404 returned error can't find the container with id 5a971bc7e969d1a599d1e832cd360ecb032c66a718450f082528ac3e38f76911 Feb 18 14:17:20 crc kubenswrapper[4817]: W0218 14:17:20.696017 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4135b831_d02a_45bc_aea0_4584e8b2a01f.slice/crio-22e7a7194586191a134dbac47e3fe13b8588e75e15c70b6a80580b130bd7926e WatchSource:0}: Error finding container 22e7a7194586191a134dbac47e3fe13b8588e75e15c70b6a80580b130bd7926e: Status 404 returned error can't find the container with id 22e7a7194586191a134dbac47e3fe13b8588e75e15c70b6a80580b130bd7926e Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.830147 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jwsrh"] Feb 
18 14:17:20 crc kubenswrapper[4817]: W0218 14:17:20.838156 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5e8be8e_4ac1_4926_8790_e6910c1cbddf.slice/crio-616d213ed59849273a84fc3cca4693920944f5a9453a4d06c05db7799d1fc19a WatchSource:0}: Error finding container 616d213ed59849273a84fc3cca4693920944f5a9453a4d06c05db7799d1fc19a: Status 404 returned error can't find the container with id 616d213ed59849273a84fc3cca4693920944f5a9453a4d06c05db7799d1fc19a Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.906338 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-553d-account-create-update-twtnj" event={"ID":"4135b831-d02a-45bc-aea0-4584e8b2a01f","Type":"ContainerStarted","Data":"22e7a7194586191a134dbac47e3fe13b8588e75e15c70b6a80580b130bd7926e"} Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.908057 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jwsrh" event={"ID":"d5e8be8e-4ac1-4926-8790-e6910c1cbddf","Type":"ContainerStarted","Data":"616d213ed59849273a84fc3cca4693920944f5a9453a4d06c05db7799d1fc19a"} Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.909403 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5d24-account-create-update-rrt2r" event={"ID":"288ec94a-2fa7-44a5-afaf-1bf7909336a7","Type":"ContainerStarted","Data":"194efe15c992e2650f97126774e3a291df6b040e2f70486a39a3184736824121"} Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.916183 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4q5qq" event={"ID":"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9","Type":"ContainerStarted","Data":"6bd99071a4b4ea6f960205f55655e12ae918750bcbb2c439d956ac73363ecd21"} Feb 18 14:17:20 crc kubenswrapper[4817]: I0218 14:17:20.918305 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-hs8dt" 
event={"ID":"ff3fb952-c1b3-4311-832b-c8807407385e","Type":"ContainerStarted","Data":"5a971bc7e969d1a599d1e832cd360ecb032c66a718450f082528ac3e38f76911"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.093003 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zqdhk"] Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.216659 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bkwzx"] Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.317299 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db0f-account-create-update-ljxz9"] Feb 18 14:17:21 crc kubenswrapper[4817]: W0218 14:17:21.344700 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4e9ee1a_8b93_4306_ad94_d154b80f60c3.slice/crio-6258f6054285c4f4b3ac07ec42870a1daff2a0b7f60722169313dfd90cc6c727 WatchSource:0}: Error finding container 6258f6054285c4f4b3ac07ec42870a1daff2a0b7f60722169313dfd90cc6c727: Status 404 returned error can't find the container with id 6258f6054285c4f4b3ac07ec42870a1daff2a0b7f60722169313dfd90cc6c727 Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.346579 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9002-account-create-update-x52d2"] Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.934766 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9002-account-create-update-x52d2" event={"ID":"e4e9ee1a-8b93-4306-ad94-d154b80f60c3","Type":"ContainerStarted","Data":"2b988613ff01744e08861d50ddd930e5134baf7612fea8c6b33b9dfc712b239f"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.934812 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9002-account-create-update-x52d2" 
event={"ID":"e4e9ee1a-8b93-4306-ad94-d154b80f60c3","Type":"ContainerStarted","Data":"6258f6054285c4f4b3ac07ec42870a1daff2a0b7f60722169313dfd90cc6c727"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.938208 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bkwzx" event={"ID":"2d4b2205-ea38-4a29-858f-c2acb3cbd423","Type":"ContainerStarted","Data":"28dcfcfe333100e0d09c762f4330a068f4ac89f4ea7b3316743027d120631d18"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.938236 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bkwzx" event={"ID":"2d4b2205-ea38-4a29-858f-c2acb3cbd423","Type":"ContainerStarted","Data":"cb4143a05e1ade8a5cade5e2ac11370525fe025761eda42a1dcb01a931ae52ba"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.940828 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fbb28d6a-260d-45fa-80ec-9f583e8fc37b","Type":"ContainerStarted","Data":"51277fde8aea5163e24f18911652195090f67f937c05af03df887ea4415313aa"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.944112 4817 generic.go:334] "Generic (PLEG): container finished" podID="ff3fb952-c1b3-4311-832b-c8807407385e" containerID="762c834cb57e2f8db2c45854f68cd8b6080465bc2c067c55577f0cd7b28e38c5" exitCode=0 Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.944164 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-hs8dt" event={"ID":"ff3fb952-c1b3-4311-832b-c8807407385e","Type":"ContainerDied","Data":"762c834cb57e2f8db2c45854f68cd8b6080465bc2c067c55577f0cd7b28e38c5"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.947224 4817 generic.go:334] "Generic (PLEG): container finished" podID="4135b831-d02a-45bc-aea0-4584e8b2a01f" containerID="d8f4f516ab60608db5f1a975a5b5cf3b0fe21f143c617f74af2f246e0dbc549a" exitCode=0 Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.947285 4817 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-553d-account-create-update-twtnj" event={"ID":"4135b831-d02a-45bc-aea0-4584e8b2a01f","Type":"ContainerDied","Data":"d8f4f516ab60608db5f1a975a5b5cf3b0fe21f143c617f74af2f246e0dbc549a"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.955658 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-9002-account-create-update-x52d2" podStartSLOduration=2.955638913 podStartE2EDuration="2.955638913s" podCreationTimestamp="2026-02-18 14:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:21.947686053 +0000 UTC m=+1104.523222046" watchObservedRunningTime="2026-02-18 14:17:21.955638913 +0000 UTC m=+1104.531174896" Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.958618 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" event={"ID":"ed148ce2-1bf9-44a7-b0bd-444c12bead6c","Type":"ContainerStarted","Data":"0e889fb035dddb82131229f54177b65d68938b655179dffd1d11fed85e0dee9c"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.958661 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" event={"ID":"ed148ce2-1bf9-44a7-b0bd-444c12bead6c","Type":"ContainerStarted","Data":"be4c6603b6fa6960d8af6d4effbe8ec41d816a7166b5367c265715c96c94e4a8"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.975933 4817 generic.go:334] "Generic (PLEG): container finished" podID="288ec94a-2fa7-44a5-afaf-1bf7909336a7" containerID="9535cedd3fd4e601b68b071a22fbf5c4ceeb7d050a20b90b8f5c374303d0f021" exitCode=0 Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.976231 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5d24-account-create-update-rrt2r" 
event={"ID":"288ec94a-2fa7-44a5-afaf-1bf7909336a7","Type":"ContainerDied","Data":"9535cedd3fd4e601b68b071a22fbf5c4ceeb7d050a20b90b8f5c374303d0f021"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.977734 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-bkwzx" podStartSLOduration=2.977713198 podStartE2EDuration="2.977713198s" podCreationTimestamp="2026-02-18 14:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:21.9706299 +0000 UTC m=+1104.546165883" watchObservedRunningTime="2026-02-18 14:17:21.977713198 +0000 UTC m=+1104.553249181" Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.978256 4817 generic.go:334] "Generic (PLEG): container finished" podID="639aeed9-1ba1-4ad8-acb2-90e3e800e4a9" containerID="bbf5c931ca0895f50fd9e9110ea5016f6e9460cc528632ab583532d485da1404" exitCode=0 Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.978348 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4q5qq" event={"ID":"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9","Type":"ContainerDied","Data":"bbf5c931ca0895f50fd9e9110ea5016f6e9460cc528632ab583532d485da1404"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.979957 4817 generic.go:334] "Generic (PLEG): container finished" podID="67e72216-792c-4525-b231-2370a5b4d8ef" containerID="363469e2aac9c38739b2c8a3ca00303e3889f1255e53a4cdddebfff3e1d4b78c" exitCode=0 Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.980149 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zqdhk" event={"ID":"67e72216-792c-4525-b231-2370a5b4d8ef","Type":"ContainerDied","Data":"363469e2aac9c38739b2c8a3ca00303e3889f1255e53a4cdddebfff3e1d4b78c"} Feb 18 14:17:21 crc kubenswrapper[4817]: I0218 14:17:21.980209 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zqdhk" 
event={"ID":"67e72216-792c-4525-b231-2370a5b4d8ef","Type":"ContainerStarted","Data":"5d419dd6c7fcb2e0dc5a38e043d6798d52b28d99db1e2b0a2fca598501b9ac15"} Feb 18 14:17:22 crc kubenswrapper[4817]: I0218 14:17:22.084316 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" podStartSLOduration=3.084295517 podStartE2EDuration="3.084295517s" podCreationTimestamp="2026-02-18 14:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:22.080293457 +0000 UTC m=+1104.655829440" watchObservedRunningTime="2026-02-18 14:17:22.084295517 +0000 UTC m=+1104.659831500" Feb 18 14:17:22 crc kubenswrapper[4817]: I0218 14:17:22.993954 4817 generic.go:334] "Generic (PLEG): container finished" podID="ed148ce2-1bf9-44a7-b0bd-444c12bead6c" containerID="0e889fb035dddb82131229f54177b65d68938b655179dffd1d11fed85e0dee9c" exitCode=0 Feb 18 14:17:22 crc kubenswrapper[4817]: I0218 14:17:22.995242 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" event={"ID":"ed148ce2-1bf9-44a7-b0bd-444c12bead6c","Type":"ContainerDied","Data":"0e889fb035dddb82131229f54177b65d68938b655179dffd1d11fed85e0dee9c"} Feb 18 14:17:22 crc kubenswrapper[4817]: I0218 14:17:22.997618 4817 generic.go:334] "Generic (PLEG): container finished" podID="e4e9ee1a-8b93-4306-ad94-d154b80f60c3" containerID="2b988613ff01744e08861d50ddd930e5134baf7612fea8c6b33b9dfc712b239f" exitCode=0 Feb 18 14:17:22 crc kubenswrapper[4817]: I0218 14:17:22.997667 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9002-account-create-update-x52d2" event={"ID":"e4e9ee1a-8b93-4306-ad94-d154b80f60c3","Type":"ContainerDied","Data":"2b988613ff01744e08861d50ddd930e5134baf7612fea8c6b33b9dfc712b239f"} Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.004944 4817 generic.go:334] "Generic 
(PLEG): container finished" podID="2d4b2205-ea38-4a29-858f-c2acb3cbd423" containerID="28dcfcfe333100e0d09c762f4330a068f4ac89f4ea7b3316743027d120631d18" exitCode=0 Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.005224 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bkwzx" event={"ID":"2d4b2205-ea38-4a29-858f-c2acb3cbd423","Type":"ContainerDied","Data":"28dcfcfe333100e0d09c762f4330a068f4ac89f4ea7b3316743027d120631d18"} Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.475206 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zqdhk" Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.635559 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67e72216-792c-4525-b231-2370a5b4d8ef-operator-scripts\") pod \"67e72216-792c-4525-b231-2370a5b4d8ef\" (UID: \"67e72216-792c-4525-b231-2370a5b4d8ef\") " Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.636080 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gtdb\" (UniqueName: \"kubernetes.io/projected/67e72216-792c-4525-b231-2370a5b4d8ef-kube-api-access-6gtdb\") pod \"67e72216-792c-4525-b231-2370a5b4d8ef\" (UID: \"67e72216-792c-4525-b231-2370a5b4d8ef\") " Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.636453 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e72216-792c-4525-b231-2370a5b4d8ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67e72216-792c-4525-b231-2370a5b4d8ef" (UID: "67e72216-792c-4525-b231-2370a5b4d8ef"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.636658 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67e72216-792c-4525-b231-2370a5b4d8ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.642244 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e72216-792c-4525-b231-2370a5b4d8ef-kube-api-access-6gtdb" (OuterVolumeSpecName: "kube-api-access-6gtdb") pod "67e72216-792c-4525-b231-2370a5b4d8ef" (UID: "67e72216-792c-4525-b231-2370a5b4d8ef"). InnerVolumeSpecName "kube-api-access-6gtdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.698894 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4q5qq" Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.740908 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gtdb\" (UniqueName: \"kubernetes.io/projected/67e72216-792c-4525-b231-2370a5b4d8ef-kube-api-access-6gtdb\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.841684 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-operator-scripts\") pod \"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9\" (UID: \"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9\") " Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.841795 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlrrs\" (UniqueName: \"kubernetes.io/projected/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-kube-api-access-rlrrs\") pod \"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9\" (UID: \"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9\") " Feb 18 14:17:23 
crc kubenswrapper[4817]: I0218 14:17:23.843313 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "639aeed9-1ba1-4ad8-acb2-90e3e800e4a9" (UID: "639aeed9-1ba1-4ad8-acb2-90e3e800e4a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.845257 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-kube-api-access-rlrrs" (OuterVolumeSpecName: "kube-api-access-rlrrs") pod "639aeed9-1ba1-4ad8-acb2-90e3e800e4a9" (UID: "639aeed9-1ba1-4ad8-acb2-90e3e800e4a9"). InnerVolumeSpecName "kube-api-access-rlrrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.944671 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlrrs\" (UniqueName: \"kubernetes.io/projected/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-kube-api-access-rlrrs\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:23 crc kubenswrapper[4817]: I0218 14:17:23.944716 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:24 crc kubenswrapper[4817]: I0218 14:17:24.017601 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4q5qq" event={"ID":"639aeed9-1ba1-4ad8-acb2-90e3e800e4a9","Type":"ContainerDied","Data":"6bd99071a4b4ea6f960205f55655e12ae918750bcbb2c439d956ac73363ecd21"} Feb 18 14:17:24 crc kubenswrapper[4817]: I0218 14:17:24.017682 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bd99071a4b4ea6f960205f55655e12ae918750bcbb2c439d956ac73363ecd21" Feb 18 14:17:24 crc 
kubenswrapper[4817]: I0218 14:17:24.017639 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4q5qq" Feb 18 14:17:24 crc kubenswrapper[4817]: I0218 14:17:24.019621 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zqdhk" Feb 18 14:17:24 crc kubenswrapper[4817]: I0218 14:17:24.020811 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zqdhk" event={"ID":"67e72216-792c-4525-b231-2370a5b4d8ef","Type":"ContainerDied","Data":"5d419dd6c7fcb2e0dc5a38e043d6798d52b28d99db1e2b0a2fca598501b9ac15"} Feb 18 14:17:24 crc kubenswrapper[4817]: I0218 14:17:24.020839 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d419dd6c7fcb2e0dc5a38e043d6798d52b28d99db1e2b0a2fca598501b9ac15" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.390042 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.523665 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5d24-account-create-update-rrt2r" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.535747 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.548579 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-553d-account-create-update-twtnj" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.564283 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-hs8dt" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.571660 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bkwzx" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.577931 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9002-account-create-update-x52d2" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.600004 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqvps\" (UniqueName: \"kubernetes.io/projected/4135b831-d02a-45bc-aea0-4584e8b2a01f-kube-api-access-pqvps\") pod \"4135b831-d02a-45bc-aea0-4584e8b2a01f\" (UID: \"4135b831-d02a-45bc-aea0-4584e8b2a01f\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.600052 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4tp9\" (UniqueName: \"kubernetes.io/projected/288ec94a-2fa7-44a5-afaf-1bf7909336a7-kube-api-access-f4tp9\") pod \"288ec94a-2fa7-44a5-afaf-1bf7909336a7\" (UID: \"288ec94a-2fa7-44a5-afaf-1bf7909336a7\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.600172 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-operator-scripts\") pod \"ed148ce2-1bf9-44a7-b0bd-444c12bead6c\" (UID: \"ed148ce2-1bf9-44a7-b0bd-444c12bead6c\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.600269 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288ec94a-2fa7-44a5-afaf-1bf7909336a7-operator-scripts\") pod \"288ec94a-2fa7-44a5-afaf-1bf7909336a7\" (UID: \"288ec94a-2fa7-44a5-afaf-1bf7909336a7\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.600369 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzzvd\" (UniqueName: \"kubernetes.io/projected/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-kube-api-access-wzzvd\") pod 
\"ed148ce2-1bf9-44a7-b0bd-444c12bead6c\" (UID: \"ed148ce2-1bf9-44a7-b0bd-444c12bead6c\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.600462 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4135b831-d02a-45bc-aea0-4584e8b2a01f-operator-scripts\") pod \"4135b831-d02a-45bc-aea0-4584e8b2a01f\" (UID: \"4135b831-d02a-45bc-aea0-4584e8b2a01f\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.601452 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed148ce2-1bf9-44a7-b0bd-444c12bead6c" (UID: "ed148ce2-1bf9-44a7-b0bd-444c12bead6c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.602240 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4135b831-d02a-45bc-aea0-4584e8b2a01f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4135b831-d02a-45bc-aea0-4584e8b2a01f" (UID: "4135b831-d02a-45bc-aea0-4584e8b2a01f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.602716 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/288ec94a-2fa7-44a5-afaf-1bf7909336a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "288ec94a-2fa7-44a5-afaf-1bf7909336a7" (UID: "288ec94a-2fa7-44a5-afaf-1bf7909336a7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.609560 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4135b831-d02a-45bc-aea0-4584e8b2a01f-kube-api-access-pqvps" (OuterVolumeSpecName: "kube-api-access-pqvps") pod "4135b831-d02a-45bc-aea0-4584e8b2a01f" (UID: "4135b831-d02a-45bc-aea0-4584e8b2a01f"). InnerVolumeSpecName "kube-api-access-pqvps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.609907 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-kube-api-access-wzzvd" (OuterVolumeSpecName: "kube-api-access-wzzvd") pod "ed148ce2-1bf9-44a7-b0bd-444c12bead6c" (UID: "ed148ce2-1bf9-44a7-b0bd-444c12bead6c"). InnerVolumeSpecName "kube-api-access-wzzvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.635521 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/288ec94a-2fa7-44a5-afaf-1bf7909336a7-kube-api-access-f4tp9" (OuterVolumeSpecName: "kube-api-access-f4tp9") pod "288ec94a-2fa7-44a5-afaf-1bf7909336a7" (UID: "288ec94a-2fa7-44a5-afaf-1bf7909336a7"). InnerVolumeSpecName "kube-api-access-f4tp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.701723 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d4b2205-ea38-4a29-858f-c2acb3cbd423-operator-scripts\") pod \"2d4b2205-ea38-4a29-858f-c2acb3cbd423\" (UID: \"2d4b2205-ea38-4a29-858f-c2acb3cbd423\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.701771 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xhsq\" (UniqueName: \"kubernetes.io/projected/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-kube-api-access-9xhsq\") pod \"e4e9ee1a-8b93-4306-ad94-d154b80f60c3\" (UID: \"e4e9ee1a-8b93-4306-ad94-d154b80f60c3\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.701832 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-operator-scripts\") pod \"e4e9ee1a-8b93-4306-ad94-d154b80f60c3\" (UID: \"e4e9ee1a-8b93-4306-ad94-d154b80f60c3\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.701997 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6qfg\" (UniqueName: \"kubernetes.io/projected/ff3fb952-c1b3-4311-832b-c8807407385e-kube-api-access-l6qfg\") pod \"ff3fb952-c1b3-4311-832b-c8807407385e\" (UID: \"ff3fb952-c1b3-4311-832b-c8807407385e\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.702059 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr4sr\" (UniqueName: \"kubernetes.io/projected/2d4b2205-ea38-4a29-858f-c2acb3cbd423-kube-api-access-mr4sr\") pod \"2d4b2205-ea38-4a29-858f-c2acb3cbd423\" (UID: \"2d4b2205-ea38-4a29-858f-c2acb3cbd423\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.702154 4817 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff3fb952-c1b3-4311-832b-c8807407385e-operator-scripts\") pod \"ff3fb952-c1b3-4311-832b-c8807407385e\" (UID: \"ff3fb952-c1b3-4311-832b-c8807407385e\") " Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.702632 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4135b831-d02a-45bc-aea0-4584e8b2a01f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.702659 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqvps\" (UniqueName: \"kubernetes.io/projected/4135b831-d02a-45bc-aea0-4584e8b2a01f-kube-api-access-pqvps\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.702673 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4tp9\" (UniqueName: \"kubernetes.io/projected/288ec94a-2fa7-44a5-afaf-1bf7909336a7-kube-api-access-f4tp9\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.702683 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.702693 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288ec94a-2fa7-44a5-afaf-1bf7909336a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.702703 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzzvd\" (UniqueName: \"kubernetes.io/projected/ed148ce2-1bf9-44a7-b0bd-444c12bead6c-kube-api-access-wzzvd\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.703318 4817 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff3fb952-c1b3-4311-832b-c8807407385e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff3fb952-c1b3-4311-832b-c8807407385e" (UID: "ff3fb952-c1b3-4311-832b-c8807407385e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.704381 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e4e9ee1a-8b93-4306-ad94-d154b80f60c3" (UID: "e4e9ee1a-8b93-4306-ad94-d154b80f60c3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.704494 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d4b2205-ea38-4a29-858f-c2acb3cbd423-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d4b2205-ea38-4a29-858f-c2acb3cbd423" (UID: "2d4b2205-ea38-4a29-858f-c2acb3cbd423"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.708509 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-kube-api-access-9xhsq" (OuterVolumeSpecName: "kube-api-access-9xhsq") pod "e4e9ee1a-8b93-4306-ad94-d154b80f60c3" (UID: "e4e9ee1a-8b93-4306-ad94-d154b80f60c3"). InnerVolumeSpecName "kube-api-access-9xhsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.708568 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3fb952-c1b3-4311-832b-c8807407385e-kube-api-access-l6qfg" (OuterVolumeSpecName: "kube-api-access-l6qfg") pod "ff3fb952-c1b3-4311-832b-c8807407385e" (UID: "ff3fb952-c1b3-4311-832b-c8807407385e"). InnerVolumeSpecName "kube-api-access-l6qfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.708676 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d4b2205-ea38-4a29-858f-c2acb3cbd423-kube-api-access-mr4sr" (OuterVolumeSpecName: "kube-api-access-mr4sr") pod "2d4b2205-ea38-4a29-858f-c2acb3cbd423" (UID: "2d4b2205-ea38-4a29-858f-c2acb3cbd423"). InnerVolumeSpecName "kube-api-access-mr4sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.804926 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff3fb952-c1b3-4311-832b-c8807407385e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.804964 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d4b2205-ea38-4a29-858f-c2acb3cbd423-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.804995 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xhsq\" (UniqueName: \"kubernetes.io/projected/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-kube-api-access-9xhsq\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.805010 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e4e9ee1a-8b93-4306-ad94-d154b80f60c3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.805022 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6qfg\" (UniqueName: \"kubernetes.io/projected/ff3fb952-c1b3-4311-832b-c8807407385e-kube-api-access-l6qfg\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:26 crc kubenswrapper[4817]: I0218 14:17:26.809529 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr4sr\" (UniqueName: \"kubernetes.io/projected/2d4b2205-ea38-4a29-858f-c2acb3cbd423-kube-api-access-mr4sr\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.047837 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" event={"ID":"ed148ce2-1bf9-44a7-b0bd-444c12bead6c","Type":"ContainerDied","Data":"be4c6603b6fa6960d8af6d4effbe8ec41d816a7166b5367c265715c96c94e4a8"} Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.048144 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be4c6603b6fa6960d8af6d4effbe8ec41d816a7166b5367c265715c96c94e4a8" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.047855 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db0f-account-create-update-ljxz9" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.049622 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9002-account-create-update-x52d2" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.049619 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9002-account-create-update-x52d2" event={"ID":"e4e9ee1a-8b93-4306-ad94-d154b80f60c3","Type":"ContainerDied","Data":"6258f6054285c4f4b3ac07ec42870a1daff2a0b7f60722169313dfd90cc6c727"} Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.049784 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6258f6054285c4f4b3ac07ec42870a1daff2a0b7f60722169313dfd90cc6c727" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.052221 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5d24-account-create-update-rrt2r" event={"ID":"288ec94a-2fa7-44a5-afaf-1bf7909336a7","Type":"ContainerDied","Data":"194efe15c992e2650f97126774e3a291df6b040e2f70486a39a3184736824121"} Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.052283 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="194efe15c992e2650f97126774e3a291df6b040e2f70486a39a3184736824121" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.052247 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5d24-account-create-update-rrt2r" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.055074 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bkwzx" event={"ID":"2d4b2205-ea38-4a29-858f-c2acb3cbd423","Type":"ContainerDied","Data":"cb4143a05e1ade8a5cade5e2ac11370525fe025761eda42a1dcb01a931ae52ba"} Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.055111 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb4143a05e1ade8a5cade5e2ac11370525fe025761eda42a1dcb01a931ae52ba" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.055243 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bkwzx" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.060651 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-hs8dt" event={"ID":"ff3fb952-c1b3-4311-832b-c8807407385e","Type":"ContainerDied","Data":"5a971bc7e969d1a599d1e832cd360ecb032c66a718450f082528ac3e38f76911"} Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.060825 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a971bc7e969d1a599d1e832cd360ecb032c66a718450f082528ac3e38f76911" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.060699 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-hs8dt" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.062936 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-553d-account-create-update-twtnj" event={"ID":"4135b831-d02a-45bc-aea0-4584e8b2a01f","Type":"ContainerDied","Data":"22e7a7194586191a134dbac47e3fe13b8588e75e15c70b6a80580b130bd7926e"} Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.062968 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-553d-account-create-update-twtnj" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.063004 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22e7a7194586191a134dbac47e3fe13b8588e75e15c70b6a80580b130bd7926e" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.537603 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.617030 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4jpr4"] Feb 18 14:17:27 crc kubenswrapper[4817]: I0218 14:17:27.617316 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-4jpr4" podUID="6e365fd4-7c85-448e-b932-e12471d948d5" containerName="dnsmasq-dns" containerID="cri-o://3e253b2e3d34a6940187b42d285539b0c2670e0a2e80d4122ef03184dfff4251" gracePeriod=10 Feb 18 14:17:28 crc kubenswrapper[4817]: I0218 14:17:28.074019 4817 generic.go:334] "Generic (PLEG): container finished" podID="fbb28d6a-260d-45fa-80ec-9f583e8fc37b" containerID="51277fde8aea5163e24f18911652195090f67f937c05af03df887ea4415313aa" exitCode=0 Feb 18 14:17:28 crc kubenswrapper[4817]: I0218 14:17:28.074112 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fbb28d6a-260d-45fa-80ec-9f583e8fc37b","Type":"ContainerDied","Data":"51277fde8aea5163e24f18911652195090f67f937c05af03df887ea4415313aa"} Feb 18 14:17:28 crc kubenswrapper[4817]: I0218 14:17:28.076581 4817 generic.go:334] "Generic (PLEG): container finished" podID="6e365fd4-7c85-448e-b932-e12471d948d5" containerID="3e253b2e3d34a6940187b42d285539b0c2670e0a2e80d4122ef03184dfff4251" exitCode=0 Feb 18 14:17:28 crc kubenswrapper[4817]: I0218 14:17:28.076628 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4jpr4" 
event={"ID":"6e365fd4-7c85-448e-b932-e12471d948d5","Type":"ContainerDied","Data":"3e253b2e3d34a6940187b42d285539b0c2670e0a2e80d4122ef03184dfff4251"} Feb 18 14:17:29 crc kubenswrapper[4817]: I0218 14:17:29.255493 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4jpr4" podUID="6e365fd4-7c85-448e-b932-e12471d948d5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: connect: connection refused" Feb 18 14:17:33 crc kubenswrapper[4817]: E0218 14:17:33.813055 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Feb 18 14:17:33 crc kubenswrapper[4817]: E0218 14:17:33.813969 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q7fzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-2fk5c_openstack(8de51007-ada2-49f5-90b2-11151899e3cf): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Feb 18 14:17:33 crc kubenswrapper[4817]: E0218 14:17:33.815220 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-2fk5c" podUID="8de51007-ada2-49f5-90b2-11151899e3cf" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.137211 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4jpr4" event={"ID":"6e365fd4-7c85-448e-b932-e12471d948d5","Type":"ContainerDied","Data":"0ff72cef63e0874ac39d44fdc5e450b328f51a202214e4b7aff780b37822df94"} Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.137524 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ff72cef63e0874ac39d44fdc5e450b328f51a202214e4b7aff780b37822df94" Feb 18 14:17:34 crc kubenswrapper[4817]: E0218 14:17:34.140505 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-2fk5c" podUID="8de51007-ada2-49f5-90b2-11151899e3cf" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.157829 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.252287 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct6xx\" (UniqueName: \"kubernetes.io/projected/6e365fd4-7c85-448e-b932-e12471d948d5-kube-api-access-ct6xx\") pod \"6e365fd4-7c85-448e-b932-e12471d948d5\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.252685 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-sb\") pod \"6e365fd4-7c85-448e-b932-e12471d948d5\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.252710 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-nb\") pod \"6e365fd4-7c85-448e-b932-e12471d948d5\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.252811 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-config\") pod \"6e365fd4-7c85-448e-b932-e12471d948d5\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.252857 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-dns-svc\") pod \"6e365fd4-7c85-448e-b932-e12471d948d5\" (UID: \"6e365fd4-7c85-448e-b932-e12471d948d5\") " Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.266792 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6e365fd4-7c85-448e-b932-e12471d948d5-kube-api-access-ct6xx" (OuterVolumeSpecName: "kube-api-access-ct6xx") pod "6e365fd4-7c85-448e-b932-e12471d948d5" (UID: "6e365fd4-7c85-448e-b932-e12471d948d5"). InnerVolumeSpecName "kube-api-access-ct6xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.315393 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-config" (OuterVolumeSpecName: "config") pod "6e365fd4-7c85-448e-b932-e12471d948d5" (UID: "6e365fd4-7c85-448e-b932-e12471d948d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.316291 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e365fd4-7c85-448e-b932-e12471d948d5" (UID: "6e365fd4-7c85-448e-b932-e12471d948d5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.327080 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e365fd4-7c85-448e-b932-e12471d948d5" (UID: "6e365fd4-7c85-448e-b932-e12471d948d5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.328593 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e365fd4-7c85-448e-b932-e12471d948d5" (UID: "6e365fd4-7c85-448e-b932-e12471d948d5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.355501 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.355567 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct6xx\" (UniqueName: \"kubernetes.io/projected/6e365fd4-7c85-448e-b932-e12471d948d5-kube-api-access-ct6xx\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.355581 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.355592 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:34 crc kubenswrapper[4817]: I0218 14:17:34.355600 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e365fd4-7c85-448e-b932-e12471d948d5-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:35 crc kubenswrapper[4817]: I0218 14:17:35.159164 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fbb28d6a-260d-45fa-80ec-9f583e8fc37b","Type":"ContainerStarted","Data":"ed3d26c667caa20ba887deab03558f9faca6ec4c31492d2ed88b894093a4e321"} Feb 18 14:17:35 crc kubenswrapper[4817]: I0218 14:17:35.160761 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4jpr4" Feb 18 14:17:35 crc kubenswrapper[4817]: I0218 14:17:35.160803 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jwsrh" event={"ID":"d5e8be8e-4ac1-4926-8790-e6910c1cbddf","Type":"ContainerStarted","Data":"2c2b15a5408ff421bfb9a43b871a3db26e198cfd4b2742e4727dc629955b23a9"} Feb 18 14:17:35 crc kubenswrapper[4817]: I0218 14:17:35.182806 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jwsrh" podStartSLOduration=3.198433717 podStartE2EDuration="16.182776957s" podCreationTimestamp="2026-02-18 14:17:19 +0000 UTC" firstStartedPulling="2026-02-18 14:17:20.855184781 +0000 UTC m=+1103.430720764" lastFinishedPulling="2026-02-18 14:17:33.839528011 +0000 UTC m=+1116.415064004" observedRunningTime="2026-02-18 14:17:35.176360456 +0000 UTC m=+1117.751896459" watchObservedRunningTime="2026-02-18 14:17:35.182776957 +0000 UTC m=+1117.758312930" Feb 18 14:17:35 crc kubenswrapper[4817]: I0218 14:17:35.209014 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4jpr4"] Feb 18 14:17:35 crc kubenswrapper[4817]: I0218 14:17:35.215435 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4jpr4"] Feb 18 14:17:35 crc kubenswrapper[4817]: E0218 14:17:35.353676 4817 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e365fd4_7c85_448e_b932_e12471d948d5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e365fd4_7c85_448e_b932_e12471d948d5.slice/crio-0ff72cef63e0874ac39d44fdc5e450b328f51a202214e4b7aff780b37822df94\": RecentStats: unable to find data in memory cache]" Feb 18 14:17:36 crc kubenswrapper[4817]: I0218 14:17:36.183040 4817 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="6e365fd4-7c85-448e-b932-e12471d948d5" path="/var/lib/kubelet/pods/6e365fd4-7c85-448e-b932-e12471d948d5/volumes" Feb 18 14:17:38 crc kubenswrapper[4817]: I0218 14:17:38.186089 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fbb28d6a-260d-45fa-80ec-9f583e8fc37b","Type":"ContainerStarted","Data":"f6648343f44a69d6a17ec18c27bd19d64fcc4c017e435d7758676da76b828d29"} Feb 18 14:17:38 crc kubenswrapper[4817]: I0218 14:17:38.186404 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fbb28d6a-260d-45fa-80ec-9f583e8fc37b","Type":"ContainerStarted","Data":"c9bc7b89f5e2658a00f0a60e7eff1191526a7b6dc9cdcb5ecfe9e28ac503b661"} Feb 18 14:17:38 crc kubenswrapper[4817]: I0218 14:17:38.213646 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.213624474 podStartE2EDuration="22.213624474s" podCreationTimestamp="2026-02-18 14:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:38.211855519 +0000 UTC m=+1120.787391512" watchObservedRunningTime="2026-02-18 14:17:38.213624474 +0000 UTC m=+1120.789160457" Feb 18 14:17:39 crc kubenswrapper[4817]: I0218 14:17:39.196961 4817 generic.go:334] "Generic (PLEG): container finished" podID="d5e8be8e-4ac1-4926-8790-e6910c1cbddf" containerID="2c2b15a5408ff421bfb9a43b871a3db26e198cfd4b2742e4727dc629955b23a9" exitCode=0 Feb 18 14:17:39 crc kubenswrapper[4817]: I0218 14:17:39.198018 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jwsrh" event={"ID":"d5e8be8e-4ac1-4926-8790-e6910c1cbddf","Type":"ContainerDied","Data":"2c2b15a5408ff421bfb9a43b871a3db26e198cfd4b2742e4727dc629955b23a9"} Feb 18 14:17:40 crc kubenswrapper[4817]: I0218 14:17:40.558087 4817 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jwsrh" Feb 18 14:17:40 crc kubenswrapper[4817]: I0218 14:17:40.678309 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjlsf\" (UniqueName: \"kubernetes.io/projected/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-kube-api-access-jjlsf\") pod \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " Feb 18 14:17:40 crc kubenswrapper[4817]: I0218 14:17:40.678368 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-combined-ca-bundle\") pod \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " Feb 18 14:17:40 crc kubenswrapper[4817]: I0218 14:17:40.678412 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-config-data\") pod \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\" (UID: \"d5e8be8e-4ac1-4926-8790-e6910c1cbddf\") " Feb 18 14:17:40 crc kubenswrapper[4817]: I0218 14:17:40.683844 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-kube-api-access-jjlsf" (OuterVolumeSpecName: "kube-api-access-jjlsf") pod "d5e8be8e-4ac1-4926-8790-e6910c1cbddf" (UID: "d5e8be8e-4ac1-4926-8790-e6910c1cbddf"). InnerVolumeSpecName "kube-api-access-jjlsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:40 crc kubenswrapper[4817]: I0218 14:17:40.704703 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5e8be8e-4ac1-4926-8790-e6910c1cbddf" (UID: "d5e8be8e-4ac1-4926-8790-e6910c1cbddf"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:17:40 crc kubenswrapper[4817]: I0218 14:17:40.735839 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-config-data" (OuterVolumeSpecName: "config-data") pod "d5e8be8e-4ac1-4926-8790-e6910c1cbddf" (UID: "d5e8be8e-4ac1-4926-8790-e6910c1cbddf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:17:40 crc kubenswrapper[4817]: I0218 14:17:40.781284 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjlsf\" (UniqueName: \"kubernetes.io/projected/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-kube-api-access-jjlsf\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:40 crc kubenswrapper[4817]: I0218 14:17:40.781331 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:40 crc kubenswrapper[4817]: I0218 14:17:40.781341 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e8be8e-4ac1-4926-8790-e6910c1cbddf-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.221280 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jwsrh" event={"ID":"d5e8be8e-4ac1-4926-8790-e6910c1cbddf","Type":"ContainerDied","Data":"616d213ed59849273a84fc3cca4693920944f5a9453a4d06c05db7799d1fc19a"} Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.221316 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616d213ed59849273a84fc3cca4693920944f5a9453a4d06c05db7799d1fc19a" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.221324 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jwsrh" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.542365 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-znbw9"] Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 14:17:41.543134 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e365fd4-7c85-448e-b932-e12471d948d5" containerName="init" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543158 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e365fd4-7c85-448e-b932-e12471d948d5" containerName="init" Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 14:17:41.543176 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e8be8e-4ac1-4926-8790-e6910c1cbddf" containerName="keystone-db-sync" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543186 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e8be8e-4ac1-4926-8790-e6910c1cbddf" containerName="keystone-db-sync" Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 14:17:41.543203 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e9ee1a-8b93-4306-ad94-d154b80f60c3" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543211 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e9ee1a-8b93-4306-ad94-d154b80f60c3" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 14:17:41.543226 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4135b831-d02a-45bc-aea0-4584e8b2a01f" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543235 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="4135b831-d02a-45bc-aea0-4584e8b2a01f" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 14:17:41.543252 4817 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2d4b2205-ea38-4a29-858f-c2acb3cbd423" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543264 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d4b2205-ea38-4a29-858f-c2acb3cbd423" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 14:17:41.543307 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="288ec94a-2fa7-44a5-afaf-1bf7909336a7" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543316 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="288ec94a-2fa7-44a5-afaf-1bf7909336a7" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 14:17:41.543345 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3fb952-c1b3-4311-832b-c8807407385e" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543354 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3fb952-c1b3-4311-832b-c8807407385e" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 14:17:41.543375 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639aeed9-1ba1-4ad8-acb2-90e3e800e4a9" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543383 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="639aeed9-1ba1-4ad8-acb2-90e3e800e4a9" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 14:17:41.543399 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e72216-792c-4525-b231-2370a5b4d8ef" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543407 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e72216-792c-4525-b231-2370a5b4d8ef" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 
14:17:41.543423 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed148ce2-1bf9-44a7-b0bd-444c12bead6c" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543430 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed148ce2-1bf9-44a7-b0bd-444c12bead6c" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: E0218 14:17:41.543444 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e365fd4-7c85-448e-b932-e12471d948d5" containerName="dnsmasq-dns" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543452 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e365fd4-7c85-448e-b932-e12471d948d5" containerName="dnsmasq-dns" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543695 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d4b2205-ea38-4a29-858f-c2acb3cbd423" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543712 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3fb952-c1b3-4311-832b-c8807407385e" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543722 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e8be8e-4ac1-4926-8790-e6910c1cbddf" containerName="keystone-db-sync" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543738 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="639aeed9-1ba1-4ad8-acb2-90e3e800e4a9" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543753 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed148ce2-1bf9-44a7-b0bd-444c12bead6c" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543767 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e365fd4-7c85-448e-b932-e12471d948d5" 
containerName="dnsmasq-dns" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543786 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e9ee1a-8b93-4306-ad94-d154b80f60c3" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543796 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="4135b831-d02a-45bc-aea0-4584e8b2a01f" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543810 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e72216-792c-4525-b231-2370a5b4d8ef" containerName="mariadb-database-create" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.543822 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="288ec94a-2fa7-44a5-afaf-1bf7909336a7" containerName="mariadb-account-create-update" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.544766 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.550303 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-znbw9"] Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.566822 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.567026 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.567172 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.567248 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.567379 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2ntmd" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.582883 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-z4c99"] Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.584770 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.606044 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-z4c99"] Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.719941 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-2pxsw"] Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.721489 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.724057 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vs6xr" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.728956 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729102 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-svc\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729136 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729161 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-combined-ca-bundle\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729226 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729245 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-config-data\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729275 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckfbt\" (UniqueName: \"kubernetes.io/projected/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-kube-api-access-ckfbt\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729297 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-fernet-keys\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729319 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-scripts\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729333 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-credential-keys\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729361 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-config\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729377 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8mgw\" (UniqueName: \"kubernetes.io/projected/c9aa3676-feff-46e6-b4fd-bcbdb8260948-kube-api-access-p8mgw\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729621 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.729747 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.744282 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2pxsw"] Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.821843 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-4xc6g"] Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.823410 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.825219 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.828823 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.829151 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.829418 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-zgqz6" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.830640 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckfbt\" (UniqueName: \"kubernetes.io/projected/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-kube-api-access-ckfbt\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.830784 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-db-sync-config-data\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.830950 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-fernet-keys\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.831108 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-scripts\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.831203 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-credential-keys\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.831304 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-scripts\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832070 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-config\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832114 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8mgw\" (UniqueName: \"kubernetes.io/projected/c9aa3676-feff-46e6-b4fd-bcbdb8260948-kube-api-access-p8mgw\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832160 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832223 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-svc\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832262 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72d54\" (UniqueName: \"kubernetes.io/projected/07f0e519-a5f3-45a2-a5da-e10f851f18df-kube-api-access-72d54\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832293 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832324 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-combined-ca-bundle\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832422 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-combined-ca-bundle\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832466 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832493 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-config-data\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832525 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-config-data\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.832563 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07f0e519-a5f3-45a2-a5da-e10f851f18df-etc-machine-id\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.833438 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-swift-storage-0\") pod 
\"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.834197 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.835224 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-config\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.837290 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-svc\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.838151 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.844116 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-4xc6g"] Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.862012 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-fernet-keys\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.862681 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-scripts\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.862907 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-config-data\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.865654 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-combined-ca-bundle\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.870554 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-credential-keys\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.873019 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckfbt\" (UniqueName: \"kubernetes.io/projected/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-kube-api-access-ckfbt\") pod \"dnsmasq-dns-55fff446b9-z4c99\" (UID: 
\"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.873075 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-sm9px"] Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.885296 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.888506 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8mgw\" (UniqueName: \"kubernetes.io/projected/c9aa3676-feff-46e6-b4fd-bcbdb8260948-kube-api-access-p8mgw\") pod \"keystone-bootstrap-znbw9\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") " pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.888916 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-znbw9" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.891586 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.892112 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lms5l" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.897289 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.932125 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-z4c99"] Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.932860 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.933768 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-config-data\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.933814 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzncd\" (UniqueName: \"kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-kube-api-access-fzncd\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.933844 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-certs\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.933880 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-combined-ca-bundle\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.933915 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72d54\" (UniqueName: \"kubernetes.io/projected/07f0e519-a5f3-45a2-a5da-e10f851f18df-kube-api-access-72d54\") pod \"cinder-db-sync-2pxsw\" (UID: 
\"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.933942 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-scripts\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.933998 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr979\" (UniqueName: \"kubernetes.io/projected/a7ca3146-2c85-46da-baeb-ea06b64ffac0-kube-api-access-zr979\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.934027 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-combined-ca-bundle\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.934056 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ca3146-2c85-46da-baeb-ea06b64ffac0-logs\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.934096 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-config-data\") pod \"cloudkitty-db-sync-4xc6g\" (UID: 
\"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.934124 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-combined-ca-bundle\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.934154 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-scripts\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.934174 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-config-data\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.934191 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07f0e519-a5f3-45a2-a5da-e10f851f18df-etc-machine-id\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.934214 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-db-sync-config-data\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc 
kubenswrapper[4817]: I0218 14:17:41.934244 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-scripts\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.934591 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07f0e519-a5f3-45a2-a5da-e10f851f18df-etc-machine-id\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.941900 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-db-sync-config-data\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.950881 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-sm9px"] Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.956517 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-scripts\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.957435 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-combined-ca-bundle\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 
14:17:41.983706 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72d54\" (UniqueName: \"kubernetes.io/projected/07f0e519-a5f3-45a2-a5da-e10f851f18df-kube-api-access-72d54\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:41 crc kubenswrapper[4817]: I0218 14:17:41.988187 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-config-data\") pod \"cinder-db-sync-2pxsw\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.021203 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-kg5vn"] Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.024488 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042186 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042239 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-scripts\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042283 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042314 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042339 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-config\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042364 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtrbb\" (UniqueName: \"kubernetes.io/projected/6690ffc6-f624-448b-81d7-e36a8e059a44-kube-api-access-rtrbb\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042391 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-config-data\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042426 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzncd\" (UniqueName: 
\"kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-kube-api-access-fzncd\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042462 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-certs\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042513 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-combined-ca-bundle\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042560 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-scripts\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042604 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042646 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr979\" (UniqueName: 
\"kubernetes.io/projected/a7ca3146-2c85-46da-baeb-ea06b64ffac0-kube-api-access-zr979\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042678 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-combined-ca-bundle\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042719 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ca3146-2c85-46da-baeb-ea06b64ffac0-logs\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.042778 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-config-data\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.054600 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-combined-ca-bundle\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.058716 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-certs\") pod \"cloudkitty-db-sync-4xc6g\" (UID: 
\"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.058861 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.059423 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ca3146-2c85-46da-baeb-ea06b64ffac0-logs\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.060765 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-config-data\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.064889 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-scripts\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.074532 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-combined-ca-bundle\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.076047 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-config-data\") pod \"placement-db-sync-sm9px\" (UID: 
\"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.087377 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzncd\" (UniqueName: \"kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-kube-api-access-fzncd\") pod \"cloudkitty-db-sync-4xc6g\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.100138 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-scripts\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.107128 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-w48xt"] Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.109679 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr979\" (UniqueName: \"kubernetes.io/projected/a7ca3146-2c85-46da-baeb-ea06b64ffac0-kube-api-access-zr979\") pod \"placement-db-sync-sm9px\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.120197 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.124764 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.125003 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v5lxv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.127549 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.130610 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.144666 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.144821 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.144883 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.144928 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.145091 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-config\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.145161 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtrbb\" (UniqueName: \"kubernetes.io/projected/6690ffc6-f624-448b-81d7-e36a8e059a44-kube-api-access-rtrbb\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.149178 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.150750 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.150886 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-config\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.151430 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.152003 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.162679 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-sm9px" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.186166 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtrbb\" (UniqueName: \"kubernetes.io/projected/6690ffc6-f624-448b-81d7-e36a8e059a44-kube-api-access-rtrbb\") pod \"dnsmasq-dns-76fcf4b695-kg5vn\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.189884 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.210297 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-kg5vn"] Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.227295 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-dzzzv"] Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.228793 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dzzzv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.235161 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.235437 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jhphm" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.237596 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dzzzv"] Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.248388 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-combined-ca-bundle\") pod \"neutron-db-sync-w48xt\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.252141 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r97sn\" (UniqueName: \"kubernetes.io/projected/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-kube-api-access-r97sn\") pod \"neutron-db-sync-w48xt\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.252354 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-config\") pod \"neutron-db-sync-w48xt\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.262334 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w48xt"] Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.270562 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.272779 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.275408 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.288755 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.290189 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.290331 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.356244 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw8g5\" (UniqueName: \"kubernetes.io/projected/162fd834-bb59-43f4-98f0-9acb0333e71c-kube-api-access-mw8g5\") pod \"barbican-db-sync-dzzzv\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") " pod="openstack/barbican-db-sync-dzzzv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.356337 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-combined-ca-bundle\") pod \"barbican-db-sync-dzzzv\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") " pod="openstack/barbican-db-sync-dzzzv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.356407 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-db-sync-config-data\") pod \"barbican-db-sync-dzzzv\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") " pod="openstack/barbican-db-sync-dzzzv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.356477 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-combined-ca-bundle\") pod \"neutron-db-sync-w48xt\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.356534 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r97sn\" (UniqueName: \"kubernetes.io/projected/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-kube-api-access-r97sn\") pod \"neutron-db-sync-w48xt\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.356629 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-config\") pod \"neutron-db-sync-w48xt\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.373809 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-combined-ca-bundle\") pod \"neutron-db-sync-w48xt\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.382802 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-config\") pod \"neutron-db-sync-w48xt\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.397727 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r97sn\" (UniqueName: \"kubernetes.io/projected/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-kube-api-access-r97sn\") pod \"neutron-db-sync-w48xt\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.458907 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-scripts\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.458957 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.459375 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw8g5\" (UniqueName: \"kubernetes.io/projected/162fd834-bb59-43f4-98f0-9acb0333e71c-kube-api-access-mw8g5\") pod \"barbican-db-sync-dzzzv\" (UID: 
\"162fd834-bb59-43f4-98f0-9acb0333e71c\") " pod="openstack/barbican-db-sync-dzzzv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.460580 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.460711 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-combined-ca-bundle\") pod \"barbican-db-sync-dzzzv\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") " pod="openstack/barbican-db-sync-dzzzv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.460783 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-run-httpd\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.460869 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tfr9\" (UniqueName: \"kubernetes.io/projected/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-kube-api-access-8tfr9\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.460896 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-db-sync-config-data\") pod \"barbican-db-sync-dzzzv\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") " pod="openstack/barbican-db-sync-dzzzv" Feb 18 
14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.460932 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-log-httpd\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.461235 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-config-data\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.479062 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-db-sync-config-data\") pod \"barbican-db-sync-dzzzv\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") " pod="openstack/barbican-db-sync-dzzzv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.483501 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw8g5\" (UniqueName: \"kubernetes.io/projected/162fd834-bb59-43f4-98f0-9acb0333e71c-kube-api-access-mw8g5\") pod \"barbican-db-sync-dzzzv\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") " pod="openstack/barbican-db-sync-dzzzv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.484302 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-combined-ca-bundle\") pod \"barbican-db-sync-dzzzv\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") " pod="openstack/barbican-db-sync-dzzzv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.510209 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w48xt" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.566210 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-log-httpd\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.566335 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-config-data\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.566450 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-scripts\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.566511 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.566646 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.566749 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-run-httpd\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.566829 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tfr9\" (UniqueName: \"kubernetes.io/projected/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-kube-api-access-8tfr9\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.567604 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-log-httpd\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.568810 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-run-httpd\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.573580 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.574391 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.574826 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-scripts\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.575802 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-config-data\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.595826 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tfr9\" (UniqueName: \"kubernetes.io/projected/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-kube-api-access-8tfr9\") pod \"ceilometer-0\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.727047 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dzzzv" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.740699 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.810447 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-znbw9"] Feb 18 14:17:42 crc kubenswrapper[4817]: I0218 14:17:42.828956 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2pxsw"] Feb 18 14:17:43 crc kubenswrapper[4817]: I0218 14:17:43.047174 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-z4c99"] Feb 18 14:17:43 crc kubenswrapper[4817]: I0218 14:17:43.256207 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-kg5vn"] Feb 18 14:17:43 crc kubenswrapper[4817]: W0218 14:17:43.278308 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6690ffc6_f624_448b_81d7_e36a8e059a44.slice/crio-18ff0d42aef7248985e94b80487347aca4fe620a6723074d51de5283441d48b3 WatchSource:0}: Error finding container 18ff0d42aef7248985e94b80487347aca4fe620a6723074d51de5283441d48b3: Status 404 returned error can't find the container with id 18ff0d42aef7248985e94b80487347aca4fe620a6723074d51de5283441d48b3 Feb 18 14:17:43 crc kubenswrapper[4817]: I0218 14:17:43.314678 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-znbw9" event={"ID":"c9aa3676-feff-46e6-b4fd-bcbdb8260948","Type":"ContainerStarted","Data":"1e774f0b2f33f063990281c8368c8db2fb981afeb1cb839effa4c8860d5c3761"} Feb 18 14:17:43 crc kubenswrapper[4817]: I0218 14:17:43.339027 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-z4c99" event={"ID":"1002ac08-1dd8-4224-8c9d-73ce218c2dc1","Type":"ContainerStarted","Data":"6e63a69db21086b5f6f9ceb3773949f401227a2d8d07c7b7eaab9b9760f9928d"} Feb 18 14:17:43 crc kubenswrapper[4817]: I0218 14:17:43.362277 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-2pxsw" event={"ID":"07f0e519-a5f3-45a2-a5da-e10f851f18df","Type":"ContainerStarted","Data":"c24ed8c4067b5ac77147b6ac1e83f5638ebec6400da8610a6339156d9754294b"} Feb 18 14:17:43 crc kubenswrapper[4817]: I0218 14:17:43.492839 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w48xt"] Feb 18 14:17:43 crc kubenswrapper[4817]: I0218 14:17:43.540117 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-4xc6g"] Feb 18 14:17:43 crc kubenswrapper[4817]: I0218 14:17:43.560602 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-sm9px"] Feb 18 14:17:43 crc kubenswrapper[4817]: I0218 14:17:43.621157 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dzzzv"] Feb 18 14:17:43 crc kubenswrapper[4817]: I0218 14:17:43.645850 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.008924 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.398465 4817 generic.go:334] "Generic (PLEG): container finished" podID="6690ffc6-f624-448b-81d7-e36a8e059a44" containerID="424995e206af878c989f6525a5276cac37e9a5d54275e333164250b565d75508" exitCode=0 Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.399033 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" event={"ID":"6690ffc6-f624-448b-81d7-e36a8e059a44","Type":"ContainerDied","Data":"424995e206af878c989f6525a5276cac37e9a5d54275e333164250b565d75508"} Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.399418 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" 
event={"ID":"6690ffc6-f624-448b-81d7-e36a8e059a44","Type":"ContainerStarted","Data":"18ff0d42aef7248985e94b80487347aca4fe620a6723074d51de5283441d48b3"} Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.402069 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-znbw9" event={"ID":"c9aa3676-feff-46e6-b4fd-bcbdb8260948","Type":"ContainerStarted","Data":"6666a003b16cfcafe8266b9637813f3eb39a7b98265f5a7618da6683932a4c02"} Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.437336 4817 generic.go:334] "Generic (PLEG): container finished" podID="1002ac08-1dd8-4224-8c9d-73ce218c2dc1" containerID="01cf88fd64a207525d35d8fd76c619c669ad90a43d8e8fe191c8755c1510a2c0" exitCode=0 Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.437436 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-z4c99" event={"ID":"1002ac08-1dd8-4224-8c9d-73ce218c2dc1","Type":"ContainerDied","Data":"01cf88fd64a207525d35d8fd76c619c669ad90a43d8e8fe191c8755c1510a2c0"} Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.467266 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w48xt" event={"ID":"fb12a33e-172a-4c2d-8c97-8ae5486ce22d","Type":"ContainerStarted","Data":"47021a4d2507c0e35400943910b6d9219bf27a681ec61204b6398fd99f2a3061"} Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.467315 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w48xt" event={"ID":"fb12a33e-172a-4c2d-8c97-8ae5486ce22d","Type":"ContainerStarted","Data":"429985b7d225d27c1d2c8b413a6eb9965dcb67726c7bf772f1316f08f004e838"} Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.472535 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-znbw9" podStartSLOduration=3.472513794 podStartE2EDuration="3.472513794s" podCreationTimestamp="2026-02-18 14:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:44.454888101 +0000 UTC m=+1127.030424084" watchObservedRunningTime="2026-02-18 14:17:44.472513794 +0000 UTC m=+1127.048049767" Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.473000 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-4xc6g" event={"ID":"0e385fdc-9c05-49ce-a823-dd99efa98e94","Type":"ContainerStarted","Data":"48d77ced66073362c74a1ad7dac9c386205f7a7b9ae6a91001dd855c7440a975"} Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.489495 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4","Type":"ContainerStarted","Data":"674a8b91dc75ed6f1cd15f65a56f985f37af91a4bd2d749959aa5203fd6d43bd"} Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.491330 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sm9px" event={"ID":"a7ca3146-2c85-46da-baeb-ea06b64ffac0","Type":"ContainerStarted","Data":"a8758ad50f7e5fa376f09caa9a6fd2463f71789ff720dde4e798d8036b38554d"} Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.495247 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dzzzv" event={"ID":"162fd834-bb59-43f4-98f0-9acb0333e71c","Type":"ContainerStarted","Data":"89eb4e6b2539f55acfb94cd58dcb6775472482b604951c503207fc700cb1db6b"} Feb 18 14:17:44 crc kubenswrapper[4817]: I0218 14:17:44.512484 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-w48xt" podStartSLOduration=3.512464659 podStartE2EDuration="3.512464659s" podCreationTimestamp="2026-02-18 14:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:44.499337939 +0000 UTC m=+1127.074873942" watchObservedRunningTime="2026-02-18 14:17:44.512464659 +0000 UTC 
m=+1127.088000642" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.046861 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.203027 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-nb\") pod \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.203099 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-config\") pod \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.203237 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-swift-storage-0\") pod \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.203822 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-sb\") pod \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.203851 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckfbt\" (UniqueName: \"kubernetes.io/projected/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-kube-api-access-ckfbt\") pod \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " Feb 18 
14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.203910 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-svc\") pod \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\" (UID: \"1002ac08-1dd8-4224-8c9d-73ce218c2dc1\") " Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.230909 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-kube-api-access-ckfbt" (OuterVolumeSpecName: "kube-api-access-ckfbt") pod "1002ac08-1dd8-4224-8c9d-73ce218c2dc1" (UID: "1002ac08-1dd8-4224-8c9d-73ce218c2dc1"). InnerVolumeSpecName "kube-api-access-ckfbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.264232 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1002ac08-1dd8-4224-8c9d-73ce218c2dc1" (UID: "1002ac08-1dd8-4224-8c9d-73ce218c2dc1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.277260 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1002ac08-1dd8-4224-8c9d-73ce218c2dc1" (UID: "1002ac08-1dd8-4224-8c9d-73ce218c2dc1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.306764 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.306796 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckfbt\" (UniqueName: \"kubernetes.io/projected/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-kube-api-access-ckfbt\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.306806 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.320455 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1002ac08-1dd8-4224-8c9d-73ce218c2dc1" (UID: "1002ac08-1dd8-4224-8c9d-73ce218c2dc1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.322152 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1002ac08-1dd8-4224-8c9d-73ce218c2dc1" (UID: "1002ac08-1dd8-4224-8c9d-73ce218c2dc1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.361916 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-config" (OuterVolumeSpecName: "config") pod "1002ac08-1dd8-4224-8c9d-73ce218c2dc1" (UID: "1002ac08-1dd8-4224-8c9d-73ce218c2dc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.408502 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.408534 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.408545 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1002ac08-1dd8-4224-8c9d-73ce218c2dc1-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.532235 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-z4c99" event={"ID":"1002ac08-1dd8-4224-8c9d-73ce218c2dc1","Type":"ContainerDied","Data":"6e63a69db21086b5f6f9ceb3773949f401227a2d8d07c7b7eaab9b9760f9928d"} Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.532294 4817 scope.go:117] "RemoveContainer" containerID="01cf88fd64a207525d35d8fd76c619c669ad90a43d8e8fe191c8755c1510a2c0" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.532445 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-z4c99" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.543180 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" event={"ID":"6690ffc6-f624-448b-81d7-e36a8e059a44","Type":"ContainerStarted","Data":"ec983c9bde2a9cfc3193bd7dfe622c99f90676a21ce1f58fa9eaf2e7ca242469"} Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.607591 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" podStartSLOduration=4.607569807 podStartE2EDuration="4.607569807s" podCreationTimestamp="2026-02-18 14:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:17:45.56952147 +0000 UTC m=+1128.145057453" watchObservedRunningTime="2026-02-18 14:17:45.607569807 +0000 UTC m=+1128.183105790" Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.650786 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-z4c99"] Feb 18 14:17:45 crc kubenswrapper[4817]: I0218 14:17:45.676686 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-z4c99"] Feb 18 14:17:46 crc kubenswrapper[4817]: I0218 14:17:46.240057 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1002ac08-1dd8-4224-8c9d-73ce218c2dc1" path="/var/lib/kubelet/pods/1002ac08-1dd8-4224-8c9d-73ce218c2dc1/volumes" Feb 18 14:17:46 crc kubenswrapper[4817]: I0218 14:17:46.574043 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:47 crc kubenswrapper[4817]: I0218 14:17:47.268280 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:47 crc kubenswrapper[4817]: I0218 14:17:47.281255 4817 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:47 crc kubenswrapper[4817]: I0218 14:17:47.595198 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 18 14:17:50 crc kubenswrapper[4817]: I0218 14:17:50.618859 4817 generic.go:334] "Generic (PLEG): container finished" podID="c9aa3676-feff-46e6-b4fd-bcbdb8260948" containerID="6666a003b16cfcafe8266b9637813f3eb39a7b98265f5a7618da6683932a4c02" exitCode=0 Feb 18 14:17:50 crc kubenswrapper[4817]: I0218 14:17:50.618965 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-znbw9" event={"ID":"c9aa3676-feff-46e6-b4fd-bcbdb8260948","Type":"ContainerDied","Data":"6666a003b16cfcafe8266b9637813f3eb39a7b98265f5a7618da6683932a4c02"} Feb 18 14:17:52 crc kubenswrapper[4817]: I0218 14:17:52.192927 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:17:52 crc kubenswrapper[4817]: I0218 14:17:52.264334 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-b795s"] Feb 18 14:17:52 crc kubenswrapper[4817]: I0218 14:17:52.265098 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" podUID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerName="dnsmasq-dns" containerID="cri-o://65c0df07b7ef056ed33a7df783205f0d6e71f08915af3ed65418d22a3a7a4a15" gracePeriod=10 Feb 18 14:17:52 crc kubenswrapper[4817]: I0218 14:17:52.535793 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" podUID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Feb 18 14:17:53 crc kubenswrapper[4817]: I0218 14:17:53.658101 4817 generic.go:334] "Generic (PLEG): container finished" 
podID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerID="65c0df07b7ef056ed33a7df783205f0d6e71f08915af3ed65418d22a3a7a4a15" exitCode=0
Feb 18 14:17:53 crc kubenswrapper[4817]: I0218 14:17:53.658196 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" event={"ID":"eb3fc91e-a9df-429a-b494-fbac21db2ab9","Type":"ContainerDied","Data":"65c0df07b7ef056ed33a7df783205f0d6e71f08915af3ed65418d22a3a7a4a15"}
Feb 18 14:18:02 crc kubenswrapper[4817]: I0218 14:18:02.536577 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" podUID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: i/o timeout"
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.008606 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-znbw9"
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.201307 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-combined-ca-bundle\") pod \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") "
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.201388 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-config-data\") pod \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") "
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.201446 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8mgw\" (UniqueName: \"kubernetes.io/projected/c9aa3676-feff-46e6-b4fd-bcbdb8260948-kube-api-access-p8mgw\") pod \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") "
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.201576 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-scripts\") pod \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") "
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.201636 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-credential-keys\") pod \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") "
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.201738 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-fernet-keys\") pod \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\" (UID: \"c9aa3676-feff-46e6-b4fd-bcbdb8260948\") "
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.208587 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-scripts" (OuterVolumeSpecName: "scripts") pod "c9aa3676-feff-46e6-b4fd-bcbdb8260948" (UID: "c9aa3676-feff-46e6-b4fd-bcbdb8260948"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.209515 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c9aa3676-feff-46e6-b4fd-bcbdb8260948" (UID: "c9aa3676-feff-46e6-b4fd-bcbdb8260948"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.211766 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9aa3676-feff-46e6-b4fd-bcbdb8260948-kube-api-access-p8mgw" (OuterVolumeSpecName: "kube-api-access-p8mgw") pod "c9aa3676-feff-46e6-b4fd-bcbdb8260948" (UID: "c9aa3676-feff-46e6-b4fd-bcbdb8260948"). InnerVolumeSpecName "kube-api-access-p8mgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.213227 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c9aa3676-feff-46e6-b4fd-bcbdb8260948" (UID: "c9aa3676-feff-46e6-b4fd-bcbdb8260948"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.233970 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-config-data" (OuterVolumeSpecName: "config-data") pod "c9aa3676-feff-46e6-b4fd-bcbdb8260948" (UID: "c9aa3676-feff-46e6-b4fd-bcbdb8260948"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.237305 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9aa3676-feff-46e6-b4fd-bcbdb8260948" (UID: "c9aa3676-feff-46e6-b4fd-bcbdb8260948"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.305733 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.305763 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.305772 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8mgw\" (UniqueName: \"kubernetes.io/projected/c9aa3676-feff-46e6-b4fd-bcbdb8260948-kube-api-access-p8mgw\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.305782 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.305791 4817 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.305798 4817 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9aa3676-feff-46e6-b4fd-bcbdb8260948-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:03 crc kubenswrapper[4817]: E0218 14:18:03.387451 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified"
Feb 18 14:18:03 crc kubenswrapper[4817]: E0218 14:18:03.387625 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhcdh5b4h656h5f4h55bh58ch548h5ffh557h57bh646h58dh677h694h5cch9ch66bh697hf5h4h55bhf7h577h594h658hd8h58bh5c6h658h684h587q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8tfr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(4f601e4c-dd3b-487c-98bb-8c4a0f4985f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.743327 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-znbw9" event={"ID":"c9aa3676-feff-46e6-b4fd-bcbdb8260948","Type":"ContainerDied","Data":"1e774f0b2f33f063990281c8368c8db2fb981afeb1cb839effa4c8860d5c3761"}
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.743365 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e774f0b2f33f063990281c8368c8db2fb981afeb1cb839effa4c8860d5c3761"
Feb 18 14:18:03 crc kubenswrapper[4817]: I0218 14:18:03.743415 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-znbw9"
Feb 18 14:18:03 crc kubenswrapper[4817]: E0218 14:18:03.891435 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Feb 18 14:18:03 crc kubenswrapper[4817]: E0218 14:18:03.891707 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mw8g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-dzzzv_openstack(162fd834-bb59-43f4-98f0-9acb0333e71c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 14:18:03 crc kubenswrapper[4817]: E0218 14:18:03.893189 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-dzzzv" podUID="162fd834-bb59-43f4-98f0-9acb0333e71c"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.089233 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-znbw9"]
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.099653 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-znbw9"]
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.211075 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9aa3676-feff-46e6-b4fd-bcbdb8260948" path="/var/lib/kubelet/pods/c9aa3676-feff-46e6-b4fd-bcbdb8260948/volumes"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.211698 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-khzqv"]
Feb 18 14:18:04 crc kubenswrapper[4817]: E0218 14:18:04.212081 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1002ac08-1dd8-4224-8c9d-73ce218c2dc1" containerName="init"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.212100 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1002ac08-1dd8-4224-8c9d-73ce218c2dc1" containerName="init"
Feb 18 14:18:04 crc kubenswrapper[4817]: E0218 14:18:04.212126 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9aa3676-feff-46e6-b4fd-bcbdb8260948" containerName="keystone-bootstrap"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.212135 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9aa3676-feff-46e6-b4fd-bcbdb8260948" containerName="keystone-bootstrap"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.212378 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9aa3676-feff-46e6-b4fd-bcbdb8260948" containerName="keystone-bootstrap"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.212401 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="1002ac08-1dd8-4224-8c9d-73ce218c2dc1" containerName="init"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.213296 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.216329 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2ntmd"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.216351 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.216411 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.216360 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.216936 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-khzqv"]
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.323370 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-credential-keys\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.323431 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-config-data\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.323463 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-fernet-keys\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.323553 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-scripts\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.323703 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc99v\" (UniqueName: \"kubernetes.io/projected/bbf9142d-d5ef-4793-a36f-9f91a8146527-kube-api-access-rc99v\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.323785 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-combined-ca-bundle\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.396202 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.425807 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc99v\" (UniqueName: \"kubernetes.io/projected/bbf9142d-d5ef-4793-a36f-9f91a8146527-kube-api-access-rc99v\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.425899 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-combined-ca-bundle\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.426007 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-credential-keys\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.426056 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-config-data\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.426093 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-fernet-keys\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.426139 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-scripts\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.431284 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-fernet-keys\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.437915 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-config-data\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.437915 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-credential-keys\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.438078 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-scripts\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.442397 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-combined-ca-bundle\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.447251 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc99v\" (UniqueName: \"kubernetes.io/projected/bbf9142d-d5ef-4793-a36f-9f91a8146527-kube-api-access-rc99v\") pod \"keystone-bootstrap-khzqv\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.527889 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-nb\") pod \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") "
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.527940 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-svc\") pod \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") "
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.527961 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnc6r\" (UniqueName: \"kubernetes.io/projected/eb3fc91e-a9df-429a-b494-fbac21db2ab9-kube-api-access-jnc6r\") pod \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") "
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.528164 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-swift-storage-0\") pod \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") "
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.528183 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-sb\") pod \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") "
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.528213 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-config\") pod \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\" (UID: \"eb3fc91e-a9df-429a-b494-fbac21db2ab9\") "
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.531668 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb3fc91e-a9df-429a-b494-fbac21db2ab9-kube-api-access-jnc6r" (OuterVolumeSpecName: "kube-api-access-jnc6r") pod "eb3fc91e-a9df-429a-b494-fbac21db2ab9" (UID: "eb3fc91e-a9df-429a-b494-fbac21db2ab9"). InnerVolumeSpecName "kube-api-access-jnc6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.550557 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-khzqv"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.575646 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb3fc91e-a9df-429a-b494-fbac21db2ab9" (UID: "eb3fc91e-a9df-429a-b494-fbac21db2ab9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.578135 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb3fc91e-a9df-429a-b494-fbac21db2ab9" (UID: "eb3fc91e-a9df-429a-b494-fbac21db2ab9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.580564 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eb3fc91e-a9df-429a-b494-fbac21db2ab9" (UID: "eb3fc91e-a9df-429a-b494-fbac21db2ab9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.587073 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb3fc91e-a9df-429a-b494-fbac21db2ab9" (UID: "eb3fc91e-a9df-429a-b494-fbac21db2ab9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.587750 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-config" (OuterVolumeSpecName: "config") pod "eb3fc91e-a9df-429a-b494-fbac21db2ab9" (UID: "eb3fc91e-a9df-429a-b494-fbac21db2ab9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.630282 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.630322 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.630336 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.630347 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.630356 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb3fc91e-a9df-429a-b494-fbac21db2ab9-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.630364 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnc6r\" (UniqueName: \"kubernetes.io/projected/eb3fc91e-a9df-429a-b494-fbac21db2ab9-kube-api-access-jnc6r\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.755850 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-b795s"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.759417 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" event={"ID":"eb3fc91e-a9df-429a-b494-fbac21db2ab9","Type":"ContainerDied","Data":"ede924ac3267edd61ebd005671c9730bba813cafcadcc33a6786d93a813e2140"}
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.759484 4817 scope.go:117] "RemoveContainer" containerID="65c0df07b7ef056ed33a7df783205f0d6e71f08915af3ed65418d22a3a7a4a15"
Feb 18 14:18:04 crc kubenswrapper[4817]: E0218 14:18:04.760273 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-dzzzv" podUID="162fd834-bb59-43f4-98f0-9acb0333e71c"
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.803745 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-b795s"]
Feb 18 14:18:04 crc kubenswrapper[4817]: I0218 14:18:04.814838 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-b795s"]
Feb 18 14:18:06 crc kubenswrapper[4817]: I0218 14:18:06.190182 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" path="/var/lib/kubelet/pods/eb3fc91e-a9df-429a-b494-fbac21db2ab9/volumes"
Feb 18 14:18:07 crc kubenswrapper[4817]: I0218 14:18:07.538050 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-b795s" podUID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: i/o timeout"
Feb 18 14:18:18 crc kubenswrapper[4817]: E0218 14:18:18.715563 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Feb 18 14:18:18 crc kubenswrapper[4817]: E0218 14:18:18.716284 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72d54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-2pxsw_openstack(07f0e519-a5f3-45a2-a5da-e10f851f18df): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 14:18:18 crc kubenswrapper[4817]: E0218 14:18:18.717786 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-2pxsw" podUID="07f0e519-a5f3-45a2-a5da-e10f851f18df"
Feb 18 14:18:18 crc kubenswrapper[4817]: E0218 14:18:18.907521 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-2pxsw" podUID="07f0e519-a5f3-45a2-a5da-e10f851f18df"
Feb 18 14:18:21 crc kubenswrapper[4817]: I0218 14:18:20.998076 4817 scope.go:117] "RemoveContainer" containerID="e38ac216656ee1ec7984ec44b6b0624827bed0e3ba8fdb0f3b57817dde46e0d9"
Feb 18 14:18:21 crc kubenswrapper[4817]: E0218 14:18:21.730171 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current"
Feb 18 14:18:21 crc kubenswrapper[4817]: E0218 14:18:21.730549 4817 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current"
Feb 18 14:18:21 crc kubenswrapper[4817]: E0218 14:18:21.730704 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,Sub
PathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fzncd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-4xc6g_openstack(0e385fdc-9c05-49ce-a823-dd99efa98e94): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:18:21 crc kubenswrapper[4817]: E0218 14:18:21.732008 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-4xc6g" podUID="0e385fdc-9c05-49ce-a823-dd99efa98e94" Feb 18 14:18:21 crc kubenswrapper[4817]: E0218 14:18:21.943540 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-4xc6g" podUID="0e385fdc-9c05-49ce-a823-dd99efa98e94" Feb 18 14:18:22 crc kubenswrapper[4817]: E0218 14:18:22.009013 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified" Feb 18 14:18:22 crc kubenswrapper[4817]: E0218 14:18:22.009208 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhcdh5b4h656h5f4h55bh58ch548h5ffh557h57bh646h58dh677h694h5cch9ch66bh697hf5h4h55bhf7h577h594h658hd8h58bh5c6h658h684h587q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8tfr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(4f601e4c-dd3b-487c-98bb-8c4a0f4985f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 18 14:18:22 crc kubenswrapper[4817]: I0218 14:18:22.566245 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-khzqv"] Feb 18 14:18:22 crc kubenswrapper[4817]: W0218 14:18:22.587501 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbf9142d_d5ef_4793_a36f_9f91a8146527.slice/crio-7c19c67f58ce7f049d9f096f2593fde02dce0a9e251eb11c20771d23124979b4 WatchSource:0}: Error finding container 7c19c67f58ce7f049d9f096f2593fde02dce0a9e251eb11c20771d23124979b4: Status 404 returned error can't find the container with id 7c19c67f58ce7f049d9f096f2593fde02dce0a9e251eb11c20771d23124979b4 Feb 18 14:18:22 crc kubenswrapper[4817]: I0218 14:18:22.957413 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fk5c" 
event={"ID":"8de51007-ada2-49f5-90b2-11151899e3cf","Type":"ContainerStarted","Data":"a2b4cacda3d056587cd39f7a1cbb66789b5d848ebe2fd068888ed5b7d9b5436a"} Feb 18 14:18:22 crc kubenswrapper[4817]: I0218 14:18:22.959794 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-khzqv" event={"ID":"bbf9142d-d5ef-4793-a36f-9f91a8146527","Type":"ContainerStarted","Data":"567f816430e7e0392dac1420abe93af070fcfc13cb22e0b64a9d79548fe9e016"} Feb 18 14:18:22 crc kubenswrapper[4817]: I0218 14:18:22.959833 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-khzqv" event={"ID":"bbf9142d-d5ef-4793-a36f-9f91a8146527","Type":"ContainerStarted","Data":"7c19c67f58ce7f049d9f096f2593fde02dce0a9e251eb11c20771d23124979b4"} Feb 18 14:18:22 crc kubenswrapper[4817]: I0218 14:18:22.961960 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sm9px" event={"ID":"a7ca3146-2c85-46da-baeb-ea06b64ffac0","Type":"ContainerStarted","Data":"e793335c53e9d73b4e1af11f18328dabf6d835bbe47c5bd702d8544002a500f0"} Feb 18 14:18:22 crc kubenswrapper[4817]: I0218 14:18:22.965138 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dzzzv" event={"ID":"162fd834-bb59-43f4-98f0-9acb0333e71c","Type":"ContainerStarted","Data":"9adb2e5a4fdf34e1a9a8889725d117c3402d9cc2068cccc21ebd90da49d70abf"} Feb 18 14:18:22 crc kubenswrapper[4817]: I0218 14:18:22.983254 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-2fk5c" podStartSLOduration=3.408791933 podStartE2EDuration="1m10.983225836s" podCreationTimestamp="2026-02-18 14:17:12 +0000 UTC" firstStartedPulling="2026-02-18 14:17:13.433001667 +0000 UTC m=+1096.008537650" lastFinishedPulling="2026-02-18 14:18:21.00743557 +0000 UTC m=+1163.582971553" observedRunningTime="2026-02-18 14:18:22.97505067 +0000 UTC m=+1165.550586653" watchObservedRunningTime="2026-02-18 14:18:22.983225836 +0000 UTC 
m=+1165.558761819" Feb 18 14:18:23 crc kubenswrapper[4817]: I0218 14:18:23.006641 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-khzqv" podStartSLOduration=19.006620384 podStartE2EDuration="19.006620384s" podCreationTimestamp="2026-02-18 14:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:18:23.002015288 +0000 UTC m=+1165.577551291" watchObservedRunningTime="2026-02-18 14:18:23.006620384 +0000 UTC m=+1165.582156367" Feb 18 14:18:23 crc kubenswrapper[4817]: I0218 14:18:23.023419 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-sm9px" podStartSLOduration=4.605937965 podStartE2EDuration="42.023398995s" podCreationTimestamp="2026-02-18 14:17:41 +0000 UTC" firstStartedPulling="2026-02-18 14:17:43.556441937 +0000 UTC m=+1126.131977920" lastFinishedPulling="2026-02-18 14:18:20.973902967 +0000 UTC m=+1163.549438950" observedRunningTime="2026-02-18 14:18:23.018329828 +0000 UTC m=+1165.593865811" watchObservedRunningTime="2026-02-18 14:18:23.023398995 +0000 UTC m=+1165.598934978" Feb 18 14:18:23 crc kubenswrapper[4817]: I0218 14:18:23.041458 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-dzzzv" podStartSLOduration=3.283761839 podStartE2EDuration="42.041436579s" podCreationTimestamp="2026-02-18 14:17:41 +0000 UTC" firstStartedPulling="2026-02-18 14:17:43.664040101 +0000 UTC m=+1126.239576084" lastFinishedPulling="2026-02-18 14:18:22.421714841 +0000 UTC m=+1164.997250824" observedRunningTime="2026-02-18 14:18:23.037349396 +0000 UTC m=+1165.612885379" watchObservedRunningTime="2026-02-18 14:18:23.041436579 +0000 UTC m=+1165.616972582" Feb 18 14:18:27 crc kubenswrapper[4817]: I0218 14:18:27.015912 4817 generic.go:334] "Generic (PLEG): container finished" podID="bbf9142d-d5ef-4793-a36f-9f91a8146527" 
containerID="567f816430e7e0392dac1420abe93af070fcfc13cb22e0b64a9d79548fe9e016" exitCode=0 Feb 18 14:18:27 crc kubenswrapper[4817]: I0218 14:18:27.016007 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-khzqv" event={"ID":"bbf9142d-d5ef-4793-a36f-9f91a8146527","Type":"ContainerDied","Data":"567f816430e7e0392dac1420abe93af070fcfc13cb22e0b64a9d79548fe9e016"} Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.867818 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-khzqv" Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.910264 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc99v\" (UniqueName: \"kubernetes.io/projected/bbf9142d-d5ef-4793-a36f-9f91a8146527-kube-api-access-rc99v\") pod \"bbf9142d-d5ef-4793-a36f-9f91a8146527\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.910367 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-scripts\") pod \"bbf9142d-d5ef-4793-a36f-9f91a8146527\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.910473 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-credential-keys\") pod \"bbf9142d-d5ef-4793-a36f-9f91a8146527\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.910577 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-combined-ca-bundle\") pod \"bbf9142d-d5ef-4793-a36f-9f91a8146527\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") 
" Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.910735 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-config-data\") pod \"bbf9142d-d5ef-4793-a36f-9f91a8146527\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.910811 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-fernet-keys\") pod \"bbf9142d-d5ef-4793-a36f-9f91a8146527\" (UID: \"bbf9142d-d5ef-4793-a36f-9f91a8146527\") " Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.917885 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bbf9142d-d5ef-4793-a36f-9f91a8146527" (UID: "bbf9142d-d5ef-4793-a36f-9f91a8146527"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.918165 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bbf9142d-d5ef-4793-a36f-9f91a8146527" (UID: "bbf9142d-d5ef-4793-a36f-9f91a8146527"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.918936 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-scripts" (OuterVolumeSpecName: "scripts") pod "bbf9142d-d5ef-4793-a36f-9f91a8146527" (UID: "bbf9142d-d5ef-4793-a36f-9f91a8146527"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.919423 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf9142d-d5ef-4793-a36f-9f91a8146527-kube-api-access-rc99v" (OuterVolumeSpecName: "kube-api-access-rc99v") pod "bbf9142d-d5ef-4793-a36f-9f91a8146527" (UID: "bbf9142d-d5ef-4793-a36f-9f91a8146527"). InnerVolumeSpecName "kube-api-access-rc99v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.944751 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbf9142d-d5ef-4793-a36f-9f91a8146527" (UID: "bbf9142d-d5ef-4793-a36f-9f91a8146527"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:36 crc kubenswrapper[4817]: I0218 14:18:36.962254 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-config-data" (OuterVolumeSpecName: "config-data") pod "bbf9142d-d5ef-4793-a36f-9f91a8146527" (UID: "bbf9142d-d5ef-4793-a36f-9f91a8146527"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.013482 4817 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.013509 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.013524 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.013531 4817 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.013540 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc99v\" (UniqueName: \"kubernetes.io/projected/bbf9142d-d5ef-4793-a36f-9f91a8146527-kube-api-access-rc99v\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.013550 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf9142d-d5ef-4793-a36f-9f91a8146527-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.119647 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-khzqv" event={"ID":"bbf9142d-d5ef-4793-a36f-9f91a8146527","Type":"ContainerDied","Data":"7c19c67f58ce7f049d9f096f2593fde02dce0a9e251eb11c20771d23124979b4"} Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 
14:18:37.119689 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c19c67f58ce7f049d9f096f2593fde02dce0a9e251eb11c20771d23124979b4" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.119728 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-khzqv" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.993845 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-755bd56c8d-4mwpl"] Feb 18 14:18:37 crc kubenswrapper[4817]: E0218 14:18:37.994551 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerName="init" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.994564 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerName="init" Feb 18 14:18:37 crc kubenswrapper[4817]: E0218 14:18:37.994596 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf9142d-d5ef-4793-a36f-9f91a8146527" containerName="keystone-bootstrap" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.994603 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9142d-d5ef-4793-a36f-9f91a8146527" containerName="keystone-bootstrap" Feb 18 14:18:37 crc kubenswrapper[4817]: E0218 14:18:37.994612 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerName="dnsmasq-dns" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.994621 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerName="dnsmasq-dns" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.994815 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf9142d-d5ef-4793-a36f-9f91a8146527" containerName="keystone-bootstrap" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.994837 4817 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="eb3fc91e-a9df-429a-b494-fbac21db2ab9" containerName="dnsmasq-dns" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.995539 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.997850 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.998176 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.998341 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.998560 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 18 14:18:37 crc kubenswrapper[4817]: I0218 14:18:37.998749 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2ntmd" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.000437 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.021829 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-755bd56c8d-4mwpl"] Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.040251 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-combined-ca-bundle\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.040320 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fwtbj\" (UniqueName: \"kubernetes.io/projected/47ace64c-6cdb-4868-8655-7e149f33a069-kube-api-access-fwtbj\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.040361 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-credential-keys\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.040432 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-config-data\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.040455 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-fernet-keys\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.040528 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-public-tls-certs\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.040550 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-internal-tls-certs\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.040581 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-scripts\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.142090 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-config-data\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.142361 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-fernet-keys\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.142499 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-public-tls-certs\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.142582 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-internal-tls-certs\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.142696 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-scripts\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.142836 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-combined-ca-bundle\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.142918 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwtbj\" (UniqueName: \"kubernetes.io/projected/47ace64c-6cdb-4868-8655-7e149f33a069-kube-api-access-fwtbj\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.143021 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-credential-keys\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.146220 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-public-tls-certs\") pod 
\"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.146268 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-internal-tls-certs\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.147173 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-credential-keys\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.148156 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-scripts\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.149781 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-config-data\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.159765 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-combined-ca-bundle\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc 
kubenswrapper[4817]: I0218 14:18:38.161142 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwtbj\" (UniqueName: \"kubernetes.io/projected/47ace64c-6cdb-4868-8655-7e149f33a069-kube-api-access-fwtbj\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.161645 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47ace64c-6cdb-4868-8655-7e149f33a069-fernet-keys\") pod \"keystone-755bd56c8d-4mwpl\" (UID: \"47ace64c-6cdb-4868-8655-7e149f33a069\") " pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.315489 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.501935 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 14:18:38 crc kubenswrapper[4817]: I0218 14:18:38.934467 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-755bd56c8d-4mwpl"] Feb 18 14:18:38 crc kubenswrapper[4817]: W0218 14:18:38.942250 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47ace64c_6cdb_4868_8655_7e149f33a069.slice/crio-ffa4fa9024ffa5e641d9c5789e00b711025d489c54ccbdb69681110e489fb33d WatchSource:0}: Error finding container ffa4fa9024ffa5e641d9c5789e00b711025d489c54ccbdb69681110e489fb33d: Status 404 returned error can't find the container with id ffa4fa9024ffa5e641d9c5789e00b711025d489c54ccbdb69681110e489fb33d Feb 18 14:18:39 crc kubenswrapper[4817]: I0218 14:18:39.146445 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4","Type":"ContainerStarted","Data":"b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12"} Feb 18 14:18:39 crc kubenswrapper[4817]: I0218 14:18:39.149863 4817 generic.go:334] "Generic (PLEG): container finished" podID="a7ca3146-2c85-46da-baeb-ea06b64ffac0" containerID="e793335c53e9d73b4e1af11f18328dabf6d835bbe47c5bd702d8544002a500f0" exitCode=0 Feb 18 14:18:39 crc kubenswrapper[4817]: I0218 14:18:39.149942 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sm9px" event={"ID":"a7ca3146-2c85-46da-baeb-ea06b64ffac0","Type":"ContainerDied","Data":"e793335c53e9d73b4e1af11f18328dabf6d835bbe47c5bd702d8544002a500f0"} Feb 18 14:18:39 crc kubenswrapper[4817]: I0218 14:18:39.152039 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-4xc6g" event={"ID":"0e385fdc-9c05-49ce-a823-dd99efa98e94","Type":"ContainerStarted","Data":"00e17bd6056b65696e77d9641aa5c8e9534f3cfe54f718e4038fd916bdae0100"} Feb 18 14:18:39 crc kubenswrapper[4817]: I0218 14:18:39.153631 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-755bd56c8d-4mwpl" event={"ID":"47ace64c-6cdb-4868-8655-7e149f33a069","Type":"ContainerStarted","Data":"ffa4fa9024ffa5e641d9c5789e00b711025d489c54ccbdb69681110e489fb33d"} Feb 18 14:18:39 crc kubenswrapper[4817]: I0218 14:18:39.155488 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2pxsw" event={"ID":"07f0e519-a5f3-45a2-a5da-e10f851f18df","Type":"ContainerStarted","Data":"215f78b9b93a35d655cb71cb9ad214093e60b960a2b02feae20952738469759d"} Feb 18 14:18:39 crc kubenswrapper[4817]: I0218 14:18:39.194958 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-4xc6g" podStartSLOduration=3.517854824 podStartE2EDuration="58.194938092s" podCreationTimestamp="2026-02-18 14:17:41 +0000 UTC" firstStartedPulling="2026-02-18 14:17:43.554268852 +0000 
UTC m=+1126.129804835" lastFinishedPulling="2026-02-18 14:18:38.23135213 +0000 UTC m=+1180.806888103" observedRunningTime="2026-02-18 14:18:39.189601908 +0000 UTC m=+1181.765137901" watchObservedRunningTime="2026-02-18 14:18:39.194938092 +0000 UTC m=+1181.770474075" Feb 18 14:18:39 crc kubenswrapper[4817]: I0218 14:18:39.224297 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-2pxsw" podStartSLOduration=3.285781362 podStartE2EDuration="58.22427006s" podCreationTimestamp="2026-02-18 14:17:41 +0000 UTC" firstStartedPulling="2026-02-18 14:17:42.830270674 +0000 UTC m=+1125.405806657" lastFinishedPulling="2026-02-18 14:18:37.768759372 +0000 UTC m=+1180.344295355" observedRunningTime="2026-02-18 14:18:39.211377266 +0000 UTC m=+1181.786913279" watchObservedRunningTime="2026-02-18 14:18:39.22427006 +0000 UTC m=+1181.799806053" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.166641 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-755bd56c8d-4mwpl" event={"ID":"47ace64c-6cdb-4868-8655-7e149f33a069","Type":"ContainerStarted","Data":"bc2e8a88c10a257d267fb811d0e14c496939070d74a1ce74d9db5d098c9a5429"} Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.199190 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-755bd56c8d-4mwpl" podStartSLOduration=3.199162276 podStartE2EDuration="3.199162276s" podCreationTimestamp="2026-02-18 14:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:18:40.185464152 +0000 UTC m=+1182.761000135" watchObservedRunningTime="2026-02-18 14:18:40.199162276 +0000 UTC m=+1182.774698269" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.660471 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-sm9px" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.710906 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-scripts\") pod \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.710962 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr979\" (UniqueName: \"kubernetes.io/projected/a7ca3146-2c85-46da-baeb-ea06b64ffac0-kube-api-access-zr979\") pod \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.711261 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ca3146-2c85-46da-baeb-ea06b64ffac0-logs\") pod \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.711280 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-config-data\") pod \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.711316 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-combined-ca-bundle\") pod \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\" (UID: \"a7ca3146-2c85-46da-baeb-ea06b64ffac0\") " Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.711656 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a7ca3146-2c85-46da-baeb-ea06b64ffac0-logs" (OuterVolumeSpecName: "logs") pod "a7ca3146-2c85-46da-baeb-ea06b64ffac0" (UID: "a7ca3146-2c85-46da-baeb-ea06b64ffac0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.711967 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7ca3146-2c85-46da-baeb-ea06b64ffac0-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.728304 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ca3146-2c85-46da-baeb-ea06b64ffac0-kube-api-access-zr979" (OuterVolumeSpecName: "kube-api-access-zr979") pod "a7ca3146-2c85-46da-baeb-ea06b64ffac0" (UID: "a7ca3146-2c85-46da-baeb-ea06b64ffac0"). InnerVolumeSpecName "kube-api-access-zr979". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.734164 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-scripts" (OuterVolumeSpecName: "scripts") pod "a7ca3146-2c85-46da-baeb-ea06b64ffac0" (UID: "a7ca3146-2c85-46da-baeb-ea06b64ffac0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.797189 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7ca3146-2c85-46da-baeb-ea06b64ffac0" (UID: "a7ca3146-2c85-46da-baeb-ea06b64ffac0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.817261 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.817293 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.817304 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr979\" (UniqueName: \"kubernetes.io/projected/a7ca3146-2c85-46da-baeb-ea06b64ffac0-kube-api-access-zr979\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.845290 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-config-data" (OuterVolumeSpecName: "config-data") pod "a7ca3146-2c85-46da-baeb-ea06b64ffac0" (UID: "a7ca3146-2c85-46da-baeb-ea06b64ffac0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:40 crc kubenswrapper[4817]: I0218 14:18:40.921783 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7ca3146-2c85-46da-baeb-ea06b64ffac0-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.181629 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-sm9px" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.181635 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sm9px" event={"ID":"a7ca3146-2c85-46da-baeb-ea06b64ffac0","Type":"ContainerDied","Data":"a8758ad50f7e5fa376f09caa9a6fd2463f71789ff720dde4e798d8036b38554d"} Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.181855 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8758ad50f7e5fa376f09caa9a6fd2463f71789ff720dde4e798d8036b38554d" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.181917 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.350914 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7b57694bbd-rpg5b"] Feb 18 14:18:41 crc kubenswrapper[4817]: E0218 14:18:41.351392 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca3146-2c85-46da-baeb-ea06b64ffac0" containerName="placement-db-sync" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.351420 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca3146-2c85-46da-baeb-ea06b64ffac0" containerName="placement-db-sync" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.351687 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ca3146-2c85-46da-baeb-ea06b64ffac0" containerName="placement-db-sync" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.353012 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.357066 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.357281 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.357458 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lms5l" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.357595 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.357744 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.378119 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b57694bbd-rpg5b"] Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.435016 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-logs\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.435073 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rf6v\" (UniqueName: \"kubernetes.io/projected/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-kube-api-access-2rf6v\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.435111 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-internal-tls-certs\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.435182 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-config-data\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.435207 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-public-tls-certs\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.435275 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-scripts\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.435316 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-combined-ca-bundle\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.536575 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-config-data\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.536677 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-public-tls-certs\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.536713 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-scripts\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.537316 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-combined-ca-bundle\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.537470 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-logs\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.537512 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rf6v\" (UniqueName: 
\"kubernetes.io/projected/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-kube-api-access-2rf6v\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.537552 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-internal-tls-certs\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.538465 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-logs\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.555472 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-scripts\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.555604 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-public-tls-certs\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.556193 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-config-data\") pod \"placement-7b57694bbd-rpg5b\" (UID: 
\"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.556370 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-internal-tls-certs\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.556418 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-combined-ca-bundle\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.570722 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rf6v\" (UniqueName: \"kubernetes.io/projected/d284bf7a-b9d0-4fe3-b8bc-06a64d104853-kube-api-access-2rf6v\") pod \"placement-7b57694bbd-rpg5b\" (UID: \"d284bf7a-b9d0-4fe3-b8bc-06a64d104853\") " pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:41 crc kubenswrapper[4817]: I0218 14:18:41.698799 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:42 crc kubenswrapper[4817]: I0218 14:18:42.237202 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b57694bbd-rpg5b"] Feb 18 14:18:43 crc kubenswrapper[4817]: I0218 14:18:43.211251 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57694bbd-rpg5b" event={"ID":"d284bf7a-b9d0-4fe3-b8bc-06a64d104853","Type":"ContainerStarted","Data":"de56beb2817fd3ea748cf6442ba7e7400ebdb6c0f8ac5e5e0f53aee7529bcbfb"} Feb 18 14:18:43 crc kubenswrapper[4817]: I0218 14:18:43.211595 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57694bbd-rpg5b" event={"ID":"d284bf7a-b9d0-4fe3-b8bc-06a64d104853","Type":"ContainerStarted","Data":"8aaa807b5f812d220e9e3f5593319889a2fb2d082bfee19aba698db7f8419b37"} Feb 18 14:18:44 crc kubenswrapper[4817]: I0218 14:18:44.224251 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b57694bbd-rpg5b" event={"ID":"d284bf7a-b9d0-4fe3-b8bc-06a64d104853","Type":"ContainerStarted","Data":"98fdafa97b59ff0b309bf0e884a04a7bb5d6ac946503e5fc9a6f84b0ca37917d"} Feb 18 14:18:44 crc kubenswrapper[4817]: I0218 14:18:44.225482 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:44 crc kubenswrapper[4817]: I0218 14:18:44.254917 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7b57694bbd-rpg5b" podStartSLOduration=3.254893855 podStartE2EDuration="3.254893855s" podCreationTimestamp="2026-02-18 14:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:18:44.249332955 +0000 UTC m=+1186.824868948" watchObservedRunningTime="2026-02-18 14:18:44.254893855 +0000 UTC m=+1186.830429838" Feb 18 14:18:45 crc kubenswrapper[4817]: I0218 14:18:45.236153 4817 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:18:51 crc kubenswrapper[4817]: E0218 14:18:51.158527 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.311511 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4","Type":"ContainerStarted","Data":"01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168"} Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.311630 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" containerName="sg-core" containerID="cri-o://b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12" gracePeriod=30 Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.311679 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.311746 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" containerName="proxy-httpd" containerID="cri-o://01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168" gracePeriod=30 Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.777230 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.867954 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-sg-core-conf-yaml\") pod \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.868037 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-run-httpd\") pod \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.868076 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tfr9\" (UniqueName: \"kubernetes.io/projected/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-kube-api-access-8tfr9\") pod \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.868162 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-log-httpd\") pod \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.868340 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-config-data\") pod \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.868396 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-scripts\") pod \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.868426 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-combined-ca-bundle\") pod \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\" (UID: \"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4\") " Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.868656 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" (UID: "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.869325 4817 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.869601 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" (UID: "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.874740 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-scripts" (OuterVolumeSpecName: "scripts") pod "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" (UID: "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.875752 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-kube-api-access-8tfr9" (OuterVolumeSpecName: "kube-api-access-8tfr9") pod "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" (UID: "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4"). InnerVolumeSpecName "kube-api-access-8tfr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.899111 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" (UID: "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.903331 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" (UID: "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.919319 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-config-data" (OuterVolumeSpecName: "config-data") pod "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" (UID: "4f601e4c-dd3b-487c-98bb-8c4a0f4985f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.971222 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.971258 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.971268 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.971280 4817 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.971288 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tfr9\" (UniqueName: \"kubernetes.io/projected/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-kube-api-access-8tfr9\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:51 crc kubenswrapper[4817]: I0218 14:18:51.971297 4817 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.323767 4817 generic.go:334] "Generic (PLEG): container finished" podID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" containerID="01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168" exitCode=0 Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.323804 4817 generic.go:334] "Generic 
(PLEG): container finished" podID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" containerID="b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12" exitCode=2 Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.323826 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4","Type":"ContainerDied","Data":"01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168"} Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.323853 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4","Type":"ContainerDied","Data":"b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12"} Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.323865 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f601e4c-dd3b-487c-98bb-8c4a0f4985f4","Type":"ContainerDied","Data":"674a8b91dc75ed6f1cd15f65a56f985f37af91a4bd2d749959aa5203fd6d43bd"} Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.323882 4817 scope.go:117] "RemoveContainer" containerID="01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.324038 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.371173 4817 scope.go:117] "RemoveContainer" containerID="b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.385746 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.395773 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.416323 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:18:52 crc kubenswrapper[4817]: E0218 14:18:52.416874 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" containerName="sg-core" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.416888 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" containerName="sg-core" Feb 18 14:18:52 crc kubenswrapper[4817]: E0218 14:18:52.416913 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" containerName="proxy-httpd" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.416919 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" containerName="proxy-httpd" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.417106 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" containerName="sg-core" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.417126 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" containerName="proxy-httpd" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.418767 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.420724 4817 scope.go:117] "RemoveContainer" containerID="01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168" Feb 18 14:18:52 crc kubenswrapper[4817]: E0218 14:18:52.421251 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168\": container with ID starting with 01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168 not found: ID does not exist" containerID="01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.421282 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168"} err="failed to get container status \"01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168\": rpc error: code = NotFound desc = could not find container \"01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168\": container with ID starting with 01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168 not found: ID does not exist" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.421306 4817 scope.go:117] "RemoveContainer" containerID="b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12" Feb 18 14:18:52 crc kubenswrapper[4817]: E0218 14:18:52.421932 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12\": container with ID starting with b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12 not found: ID does not exist" containerID="b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.421958 
4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12"} err="failed to get container status \"b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12\": rpc error: code = NotFound desc = could not find container \"b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12\": container with ID starting with b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12 not found: ID does not exist" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.421972 4817 scope.go:117] "RemoveContainer" containerID="01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.422152 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168"} err="failed to get container status \"01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168\": rpc error: code = NotFound desc = could not find container \"01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168\": container with ID starting with 01da3665c7ec3ac743280a0d552e3b141c54ad91499ac0f2313eec7bfc54c168 not found: ID does not exist" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.422166 4817 scope.go:117] "RemoveContainer" containerID="b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.422376 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12"} err="failed to get container status \"b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12\": rpc error: code = NotFound desc = could not find container \"b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12\": container with ID starting with 
b7e051dd86487059370e005d589a352b61e72aace98e62bd087e96d3895fdb12 not found: ID does not exist" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.422566 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.422722 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.435526 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.481915 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-config-data\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.482272 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-log-httpd\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.482301 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.482423 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhn5k\" (UniqueName: \"kubernetes.io/projected/b709a176-218a-4047-990d-72930e70a179-kube-api-access-jhn5k\") pod \"ceilometer-0\" 
(UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.482457 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-scripts\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.482493 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.482541 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-run-httpd\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.517803 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:18:52 crc kubenswrapper[4817]: E0218 14:18:52.518518 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-jhn5k log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="b709a176-218a-4047-990d-72930e70a179" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.585063 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-scripts\") pod \"ceilometer-0\" (UID: 
\"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.585131 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.585187 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-run-httpd\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.585228 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-config-data\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.585254 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-log-httpd\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.585285 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.585437 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jhn5k\" (UniqueName: \"kubernetes.io/projected/b709a176-218a-4047-990d-72930e70a179-kube-api-access-jhn5k\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.586348 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-run-httpd\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.586361 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-log-httpd\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.588894 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.589692 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.597697 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-scripts\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 
14:18:52.598372 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-config-data\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:52 crc kubenswrapper[4817]: I0218 14:18:52.604992 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhn5k\" (UniqueName: \"kubernetes.io/projected/b709a176-218a-4047-990d-72930e70a179-kube-api-access-jhn5k\") pod \"ceilometer-0\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " pod="openstack/ceilometer-0" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.335446 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.349747 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.401080 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-log-httpd\") pod \"b709a176-218a-4047-990d-72930e70a179\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.401257 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-config-data\") pod \"b709a176-218a-4047-990d-72930e70a179\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.401304 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhn5k\" (UniqueName: \"kubernetes.io/projected/b709a176-218a-4047-990d-72930e70a179-kube-api-access-jhn5k\") pod \"b709a176-218a-4047-990d-72930e70a179\" 
(UID: \"b709a176-218a-4047-990d-72930e70a179\") " Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.401344 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-scripts\") pod \"b709a176-218a-4047-990d-72930e70a179\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.401375 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-sg-core-conf-yaml\") pod \"b709a176-218a-4047-990d-72930e70a179\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.401457 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-run-httpd\") pod \"b709a176-218a-4047-990d-72930e70a179\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.401498 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-combined-ca-bundle\") pod \"b709a176-218a-4047-990d-72930e70a179\" (UID: \"b709a176-218a-4047-990d-72930e70a179\") " Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.402366 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b709a176-218a-4047-990d-72930e70a179" (UID: "b709a176-218a-4047-990d-72930e70a179"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.402487 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b709a176-218a-4047-990d-72930e70a179" (UID: "b709a176-218a-4047-990d-72930e70a179"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.406358 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b709a176-218a-4047-990d-72930e70a179" (UID: "b709a176-218a-4047-990d-72930e70a179"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.406381 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-scripts" (OuterVolumeSpecName: "scripts") pod "b709a176-218a-4047-990d-72930e70a179" (UID: "b709a176-218a-4047-990d-72930e70a179"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.406438 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b709a176-218a-4047-990d-72930e70a179-kube-api-access-jhn5k" (OuterVolumeSpecName: "kube-api-access-jhn5k") pod "b709a176-218a-4047-990d-72930e70a179" (UID: "b709a176-218a-4047-990d-72930e70a179"). InnerVolumeSpecName "kube-api-access-jhn5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.407019 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b709a176-218a-4047-990d-72930e70a179" (UID: "b709a176-218a-4047-990d-72930e70a179"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.415225 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-config-data" (OuterVolumeSpecName: "config-data") pod "b709a176-218a-4047-990d-72930e70a179" (UID: "b709a176-218a-4047-990d-72930e70a179"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.503556 4817 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.503589 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.503602 4817 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b709a176-218a-4047-990d-72930e70a179-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.503610 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:53 crc 
kubenswrapper[4817]: I0218 14:18:53.503617 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhn5k\" (UniqueName: \"kubernetes.io/projected/b709a176-218a-4047-990d-72930e70a179-kube-api-access-jhn5k\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.503625 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:53 crc kubenswrapper[4817]: I0218 14:18:53.503635 4817 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b709a176-218a-4047-990d-72930e70a179-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.182752 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f601e4c-dd3b-487c-98bb-8c4a0f4985f4" path="/var/lib/kubelet/pods/4f601e4c-dd3b-487c-98bb-8c4a0f4985f4/volumes" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.342702 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.396040 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.414453 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.426492 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.428901 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.442593 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.442808 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.450905 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.526406 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.526474 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.526769 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-log-httpd\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.526897 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-run-httpd\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " 
pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.526956 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-config-data\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.527068 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-scripts\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.527114 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pf8t\" (UniqueName: \"kubernetes.io/projected/69289a6c-7b95-4c16-a326-dab582fc4b86-kube-api-access-5pf8t\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.629093 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-log-httpd\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.629167 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-run-httpd\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.629200 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-config-data\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.629245 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-scripts\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.629277 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pf8t\" (UniqueName: \"kubernetes.io/projected/69289a6c-7b95-4c16-a326-dab582fc4b86-kube-api-access-5pf8t\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.629313 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.629354 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.629587 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-log-httpd\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0" Feb 18 14:18:54 crc 
kubenswrapper[4817]: I0218 14:18:54.629610 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-run-httpd\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0"
Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.635686 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0"
Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.636039 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0"
Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.636287 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-config-data\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0"
Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.636907 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-scripts\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0"
Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.649703 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pf8t\" (UniqueName: \"kubernetes.io/projected/69289a6c-7b95-4c16-a326-dab582fc4b86-kube-api-access-5pf8t\") pod \"ceilometer-0\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " pod="openstack/ceilometer-0"
Feb 18 14:18:54 crc kubenswrapper[4817]: I0218 14:18:54.750326 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:18:55 crc kubenswrapper[4817]: I0218 14:18:55.226676 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:18:55 crc kubenswrapper[4817]: W0218 14:18:55.231526 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69289a6c_7b95_4c16_a326_dab582fc4b86.slice/crio-d4df07953ad4f0ccacba3de5700a9a959deab55b104a155e3b9482a725864ec0 WatchSource:0}: Error finding container d4df07953ad4f0ccacba3de5700a9a959deab55b104a155e3b9482a725864ec0: Status 404 returned error can't find the container with id d4df07953ad4f0ccacba3de5700a9a959deab55b104a155e3b9482a725864ec0
Feb 18 14:18:55 crc kubenswrapper[4817]: I0218 14:18:55.353915 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69289a6c-7b95-4c16-a326-dab582fc4b86","Type":"ContainerStarted","Data":"d4df07953ad4f0ccacba3de5700a9a959deab55b104a155e3b9482a725864ec0"}
Feb 18 14:18:56 crc kubenswrapper[4817]: I0218 14:18:56.186436 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b709a176-218a-4047-990d-72930e70a179" path="/var/lib/kubelet/pods/b709a176-218a-4047-990d-72930e70a179/volumes"
Feb 18 14:18:56 crc kubenswrapper[4817]: I0218 14:18:56.366589 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69289a6c-7b95-4c16-a326-dab582fc4b86","Type":"ContainerStarted","Data":"446fd5b3ecb61957312dd974bf64fcfa6943b99e18c4a83378af0878c142cda6"}
Feb 18 14:18:56 crc kubenswrapper[4817]: I0218 14:18:56.368657 4817 generic.go:334] "Generic (PLEG): container finished" podID="162fd834-bb59-43f4-98f0-9acb0333e71c" containerID="9adb2e5a4fdf34e1a9a8889725d117c3402d9cc2068cccc21ebd90da49d70abf" exitCode=0
Feb 18 14:18:56 crc kubenswrapper[4817]: I0218 14:18:56.368699 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dzzzv" event={"ID":"162fd834-bb59-43f4-98f0-9acb0333e71c","Type":"ContainerDied","Data":"9adb2e5a4fdf34e1a9a8889725d117c3402d9cc2068cccc21ebd90da49d70abf"}
Feb 18 14:18:57 crc kubenswrapper[4817]: I0218 14:18:57.780161 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dzzzv"
Feb 18 14:18:57 crc kubenswrapper[4817]: I0218 14:18:57.892039 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-combined-ca-bundle\") pod \"162fd834-bb59-43f4-98f0-9acb0333e71c\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") "
Feb 18 14:18:57 crc kubenswrapper[4817]: I0218 14:18:57.892256 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw8g5\" (UniqueName: \"kubernetes.io/projected/162fd834-bb59-43f4-98f0-9acb0333e71c-kube-api-access-mw8g5\") pod \"162fd834-bb59-43f4-98f0-9acb0333e71c\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") "
Feb 18 14:18:57 crc kubenswrapper[4817]: I0218 14:18:57.892319 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-db-sync-config-data\") pod \"162fd834-bb59-43f4-98f0-9acb0333e71c\" (UID: \"162fd834-bb59-43f4-98f0-9acb0333e71c\") "
Feb 18 14:18:57 crc kubenswrapper[4817]: I0218 14:18:57.896201 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "162fd834-bb59-43f4-98f0-9acb0333e71c" (UID: "162fd834-bb59-43f4-98f0-9acb0333e71c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:18:57 crc kubenswrapper[4817]: I0218 14:18:57.896467 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/162fd834-bb59-43f4-98f0-9acb0333e71c-kube-api-access-mw8g5" (OuterVolumeSpecName: "kube-api-access-mw8g5") pod "162fd834-bb59-43f4-98f0-9acb0333e71c" (UID: "162fd834-bb59-43f4-98f0-9acb0333e71c"). InnerVolumeSpecName "kube-api-access-mw8g5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:18:57 crc kubenswrapper[4817]: I0218 14:18:57.923765 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "162fd834-bb59-43f4-98f0-9acb0333e71c" (UID: "162fd834-bb59-43f4-98f0-9acb0333e71c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:18:57 crc kubenswrapper[4817]: I0218 14:18:57.994329 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:57 crc kubenswrapper[4817]: I0218 14:18:57.994369 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw8g5\" (UniqueName: \"kubernetes.io/projected/162fd834-bb59-43f4-98f0-9acb0333e71c-kube-api-access-mw8g5\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:57 crc kubenswrapper[4817]: I0218 14:18:57.994384 4817 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/162fd834-bb59-43f4-98f0-9acb0333e71c-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.392094 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69289a6c-7b95-4c16-a326-dab582fc4b86","Type":"ContainerStarted","Data":"c3a599395d06f206c5faa5386bea5b3abc4cb33817b5a9e4c092c926794b4c0f"}
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.395044 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dzzzv" event={"ID":"162fd834-bb59-43f4-98f0-9acb0333e71c","Type":"ContainerDied","Data":"89eb4e6b2539f55acfb94cd58dcb6775472482b604951c503207fc700cb1db6b"}
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.395082 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89eb4e6b2539f55acfb94cd58dcb6775472482b604951c503207fc700cb1db6b"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.395153 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dzzzv"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.736164 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-78b498b86c-nn4vc"]
Feb 18 14:18:58 crc kubenswrapper[4817]: E0218 14:18:58.736924 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162fd834-bb59-43f4-98f0-9acb0333e71c" containerName="barbican-db-sync"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.736944 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="162fd834-bb59-43f4-98f0-9acb0333e71c" containerName="barbican-db-sync"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.737167 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="162fd834-bb59-43f4-98f0-9acb0333e71c" containerName="barbican-db-sync"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.749005 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.771756 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.772195 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jhphm"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.772377 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.777048 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5bf77b97db-hknps"]
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.779090 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.788016 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.812740 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-78b498b86c-nn4vc"]
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.825913 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864e8c7f-e3b7-4f27-960e-df753b339571-combined-ca-bundle\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.826084 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33929d5f-e679-44e2-a0e9-816088e17cb1-config-data-custom\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.826230 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/864e8c7f-e3b7-4f27-960e-df753b339571-logs\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.826314 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2kr4\" (UniqueName: \"kubernetes.io/projected/864e8c7f-e3b7-4f27-960e-df753b339571-kube-api-access-g2kr4\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.826574 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33929d5f-e679-44e2-a0e9-816088e17cb1-config-data\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.826764 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864e8c7f-e3b7-4f27-960e-df753b339571-config-data-custom\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.826904 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33929d5f-e679-44e2-a0e9-816088e17cb1-combined-ca-bundle\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.826992 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33929d5f-e679-44e2-a0e9-816088e17cb1-logs\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.827087 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6g2\" (UniqueName: \"kubernetes.io/projected/33929d5f-e679-44e2-a0e9-816088e17cb1-kube-api-access-pm6g2\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.836663 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864e8c7f-e3b7-4f27-960e-df753b339571-config-data\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.849579 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5bf77b97db-hknps"]
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.938608 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm6g2\" (UniqueName: \"kubernetes.io/projected/33929d5f-e679-44e2-a0e9-816088e17cb1-kube-api-access-pm6g2\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.938669 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864e8c7f-e3b7-4f27-960e-df753b339571-config-data\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.938756 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864e8c7f-e3b7-4f27-960e-df753b339571-combined-ca-bundle\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.938796 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33929d5f-e679-44e2-a0e9-816088e17cb1-config-data-custom\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.938845 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/864e8c7f-e3b7-4f27-960e-df753b339571-logs\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.938882 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2kr4\" (UniqueName: \"kubernetes.io/projected/864e8c7f-e3b7-4f27-960e-df753b339571-kube-api-access-g2kr4\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.938948 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33929d5f-e679-44e2-a0e9-816088e17cb1-config-data\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.939033 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864e8c7f-e3b7-4f27-960e-df753b339571-config-data-custom\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.939091 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33929d5f-e679-44e2-a0e9-816088e17cb1-combined-ca-bundle\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.939130 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33929d5f-e679-44e2-a0e9-816088e17cb1-logs\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.939613 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33929d5f-e679-44e2-a0e9-816088e17cb1-logs\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.939891 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/864e8c7f-e3b7-4f27-960e-df753b339571-logs\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.947175 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-w5k2f"]
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.952238 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/864e8c7f-e3b7-4f27-960e-df753b339571-config-data-custom\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.955785 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/864e8c7f-e3b7-4f27-960e-df753b339571-combined-ca-bundle\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.956068 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33929d5f-e679-44e2-a0e9-816088e17cb1-config-data\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.956902 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/864e8c7f-e3b7-4f27-960e-df753b339571-config-data\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.966261 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.973522 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33929d5f-e679-44e2-a0e9-816088e17cb1-combined-ca-bundle\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.977372 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm6g2\" (UniqueName: \"kubernetes.io/projected/33929d5f-e679-44e2-a0e9-816088e17cb1-kube-api-access-pm6g2\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:58 crc kubenswrapper[4817]: I0218 14:18:58.992798 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33929d5f-e679-44e2-a0e9-816088e17cb1-config-data-custom\") pod \"barbican-keystone-listener-5bf77b97db-hknps\" (UID: \"33929d5f-e679-44e2-a0e9-816088e17cb1\") " pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.006281 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-w5k2f"]
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.009904 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2kr4\" (UniqueName: \"kubernetes.io/projected/864e8c7f-e3b7-4f27-960e-df753b339571-kube-api-access-g2kr4\") pod \"barbican-worker-78b498b86c-nn4vc\" (UID: \"864e8c7f-e3b7-4f27-960e-df753b339571\") " pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.045993 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-config\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.046048 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-sb\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.046079 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-swift-storage-0\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.046119 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-svc\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.046173 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-nb\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.046269 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skmws\" (UniqueName: \"kubernetes.io/projected/f1544da4-e27b-4191-960d-d2a2145c6a42-kube-api-access-skmws\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.097951 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-78b498b86c-nn4vc"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.114213 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.124359 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-64ff6b6cd6-6qg5b"]
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.125819 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.131998 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.147895 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-nb\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.148091 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skmws\" (UniqueName: \"kubernetes.io/projected/f1544da4-e27b-4191-960d-d2a2145c6a42-kube-api-access-skmws\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.148179 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-config\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.148202 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-sb\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.148232 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-swift-storage-0\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.157312 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-svc\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.159036 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-nb\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.164129 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-svc\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.170995 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-sb\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.173449 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-config\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.181095 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skmws\" (UniqueName: \"kubernetes.io/projected/f1544da4-e27b-4191-960d-d2a2145c6a42-kube-api-access-skmws\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.191935 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-swift-storage-0\") pod \"dnsmasq-dns-7979dc8455-w5k2f\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.208223 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64ff6b6cd6-6qg5b"]
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.259486 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csp4q\" (UniqueName: \"kubernetes.io/projected/8b0830a0-1f90-4b33-8976-875adeb804f9-kube-api-access-csp4q\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.259552 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data-custom\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.259736 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.259760 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-combined-ca-bundle\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.259837 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b0830a0-1f90-4b33-8976-875adeb804f9-logs\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.363264 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csp4q\" (UniqueName: \"kubernetes.io/projected/8b0830a0-1f90-4b33-8976-875adeb804f9-kube-api-access-csp4q\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.363651 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data-custom\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.363824 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.363856 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-combined-ca-bundle\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.363951 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b0830a0-1f90-4b33-8976-875adeb804f9-logs\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.364530 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b0830a0-1f90-4b33-8976-875adeb804f9-logs\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.376837 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-combined-ca-bundle\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.379477 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.380574 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data-custom\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.410114 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csp4q\" (UniqueName: \"kubernetes.io/projected/8b0830a0-1f90-4b33-8976-875adeb804f9-kube-api-access-csp4q\") pod \"barbican-api-64ff6b6cd6-6qg5b\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.446678 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69289a6c-7b95-4c16-a326-dab582fc4b86","Type":"ContainerStarted","Data":"dde7cabe52ee3128f06443a1007f42fee418d5247be7aa5cbc03fef4f7e1dcc5"}
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.490904 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:18:59 crc kubenswrapper[4817]: I0218 14:18:59.595638 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:19:00 crc kubenswrapper[4817]: I0218 14:19:00.062204 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-78b498b86c-nn4vc"]
Feb 18 14:19:00 crc kubenswrapper[4817]: W0218 14:19:00.123739 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33929d5f_e679_44e2_a0e9_816088e17cb1.slice/crio-1ca83fd94ff692b34474d63059365d03eb37d590cb0af53347a1284d9d83f238 WatchSource:0}: Error finding container 1ca83fd94ff692b34474d63059365d03eb37d590cb0af53347a1284d9d83f238: Status 404 returned error can't find the container with id 1ca83fd94ff692b34474d63059365d03eb37d590cb0af53347a1284d9d83f238
Feb 18 14:19:00 crc kubenswrapper[4817]: I0218 14:19:00.140602 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5bf77b97db-hknps"]
Feb 18 14:19:00 crc kubenswrapper[4817]: I0218 14:19:00.372075 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-w5k2f"]
Feb 18 14:19:00 crc kubenswrapper[4817]: I0218 14:19:00.478604 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78b498b86c-nn4vc" event={"ID":"864e8c7f-e3b7-4f27-960e-df753b339571","Type":"ContainerStarted","Data":"64035b57cce6d1e71b6b94b96436735f7868626c6460dfe140a129646d23d433"}
Feb 18 14:19:00 crc kubenswrapper[4817]: I0218 14:19:00.480712 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" event={"ID":"f1544da4-e27b-4191-960d-d2a2145c6a42","Type":"ContainerStarted","Data":"4d9280af8a1dcf41baf26dc55c6c58e8abdd9dfa171442035a9c5ee2fe14f149"}
Feb 18 14:19:00 crc kubenswrapper[4817]: I0218 14:19:00.483684 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bf77b97db-hknps"
event={"ID":"33929d5f-e679-44e2-a0e9-816088e17cb1","Type":"ContainerStarted","Data":"1ca83fd94ff692b34474d63059365d03eb37d590cb0af53347a1284d9d83f238"} Feb 18 14:19:00 crc kubenswrapper[4817]: I0218 14:19:00.577473 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64ff6b6cd6-6qg5b"] Feb 18 14:19:01 crc kubenswrapper[4817]: I0218 14:19:01.500343 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69289a6c-7b95-4c16-a326-dab582fc4b86","Type":"ContainerStarted","Data":"4349e8f38b3e490984b83b035c211819bd8b59dddba035180f6537e1e67865b1"} Feb 18 14:19:01 crc kubenswrapper[4817]: I0218 14:19:01.506129 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" event={"ID":"8b0830a0-1f90-4b33-8976-875adeb804f9","Type":"ContainerStarted","Data":"78c0dd6a38727f9c2bedc253d7c6cf61f7a740807fbceb83f866aef29681a977"} Feb 18 14:19:01 crc kubenswrapper[4817]: I0218 14:19:01.506175 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" event={"ID":"8b0830a0-1f90-4b33-8976-875adeb804f9","Type":"ContainerStarted","Data":"a1834ce4dec06d3730ebe5f5f3ce43bb088ac6f06075ee7f3f96e57babfdfc43"} Feb 18 14:19:01 crc kubenswrapper[4817]: I0218 14:19:01.518524 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" event={"ID":"f1544da4-e27b-4191-960d-d2a2145c6a42","Type":"ContainerStarted","Data":"66021b93b6ee75d0eb93524e345b56d14e0c76bd376e4922b3beebfde1d71b4a"} Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.218872 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-759f74666b-ms4jl"] Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.220708 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.223706 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.224078 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.259510 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-759f74666b-ms4jl"] Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.346606 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-public-tls-certs\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.347074 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-internal-tls-certs\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.347135 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj2mf\" (UniqueName: \"kubernetes.io/projected/f6b68ae5-35a8-4050-9c64-e6ef834803fd-kube-api-access-zj2mf\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.347160 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-config-data\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.347230 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-config-data-custom\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.347384 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-combined-ca-bundle\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.347446 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b68ae5-35a8-4050-9c64-e6ef834803fd-logs\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.449548 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-combined-ca-bundle\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.449601 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b68ae5-35a8-4050-9c64-e6ef834803fd-logs\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.449659 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-public-tls-certs\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.449687 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-internal-tls-certs\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.449715 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj2mf\" (UniqueName: \"kubernetes.io/projected/f6b68ae5-35a8-4050-9c64-e6ef834803fd-kube-api-access-zj2mf\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.449737 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-config-data\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.450252 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f6b68ae5-35a8-4050-9c64-e6ef834803fd-logs\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.450479 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-config-data-custom\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.456706 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-config-data\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.457418 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-internal-tls-certs\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.458329 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-public-tls-certs\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.461468 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-config-data-custom\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.474484 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj2mf\" (UniqueName: \"kubernetes.io/projected/f6b68ae5-35a8-4050-9c64-e6ef834803fd-kube-api-access-zj2mf\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.474866 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b68ae5-35a8-4050-9c64-e6ef834803fd-combined-ca-bundle\") pod \"barbican-api-759f74666b-ms4jl\" (UID: \"f6b68ae5-35a8-4050-9c64-e6ef834803fd\") " pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.550019 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.555854 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" event={"ID":"8b0830a0-1f90-4b33-8976-875adeb804f9","Type":"ContainerStarted","Data":"ca9e72e424c10baf7d397ed920fbc591f0259c9858ae4eb24d73920e41c1129f"} Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.556036 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.559567 4817 generic.go:334] "Generic (PLEG): container finished" podID="f1544da4-e27b-4191-960d-d2a2145c6a42" containerID="66021b93b6ee75d0eb93524e345b56d14e0c76bd376e4922b3beebfde1d71b4a" exitCode=0 Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.559644 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" event={"ID":"f1544da4-e27b-4191-960d-d2a2145c6a42","Type":"ContainerDied","Data":"66021b93b6ee75d0eb93524e345b56d14e0c76bd376e4922b3beebfde1d71b4a"} Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.650575 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" podStartSLOduration=3.650551631 podStartE2EDuration="3.650551631s" podCreationTimestamp="2026-02-18 14:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:02.587938907 +0000 UTC m=+1205.163474890" watchObservedRunningTime="2026-02-18 14:19:02.650551631 +0000 UTC m=+1205.226087614" Feb 18 14:19:02 crc kubenswrapper[4817]: I0218 14:19:02.655167 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.055803613 podStartE2EDuration="8.655151636s" podCreationTimestamp="2026-02-18 14:18:54 +0000 UTC" 
firstStartedPulling="2026-02-18 14:18:55.233201219 +0000 UTC m=+1197.808737202" lastFinishedPulling="2026-02-18 14:19:00.832549242 +0000 UTC m=+1203.408085225" observedRunningTime="2026-02-18 14:19:02.611814357 +0000 UTC m=+1205.187350350" watchObservedRunningTime="2026-02-18 14:19:02.655151636 +0000 UTC m=+1205.230687619" Feb 18 14:19:03 crc kubenswrapper[4817]: I0218 14:19:03.570707 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" Feb 18 14:19:08 crc kubenswrapper[4817]: I0218 14:19:08.734547 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-759f74666b-ms4jl"] Feb 18 14:19:09 crc kubenswrapper[4817]: I0218 14:19:09.656295 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" event={"ID":"f1544da4-e27b-4191-960d-d2a2145c6a42","Type":"ContainerStarted","Data":"c14342e7e3cade0d49811e211d0118efbb76d10f00b8a8c70653c697a6221a7a"} Feb 18 14:19:09 crc kubenswrapper[4817]: I0218 14:19:09.657962 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" Feb 18 14:19:09 crc kubenswrapper[4817]: I0218 14:19:09.662201 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759f74666b-ms4jl" event={"ID":"f6b68ae5-35a8-4050-9c64-e6ef834803fd","Type":"ContainerStarted","Data":"ac616a62a389d853f5c8ba3c008e8f45007ae73c2e177b3a15a992921a141a55"} Feb 18 14:19:09 crc kubenswrapper[4817]: I0218 14:19:09.662366 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759f74666b-ms4jl" event={"ID":"f6b68ae5-35a8-4050-9c64-e6ef834803fd","Type":"ContainerStarted","Data":"caf118ddda944e253fd3fbd816cb30378fed637e9465d33eb61ed40491e3de0e"} Feb 18 14:19:09 crc kubenswrapper[4817]: I0218 14:19:09.690705 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" 
podStartSLOduration=11.69068276 podStartE2EDuration="11.69068276s" podCreationTimestamp="2026-02-18 14:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:09.682055633 +0000 UTC m=+1212.257591616" watchObservedRunningTime="2026-02-18 14:19:09.69068276 +0000 UTC m=+1212.266218743" Feb 18 14:19:10 crc kubenswrapper[4817]: I0218 14:19:10.651327 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-755bd56c8d-4mwpl" Feb 18 14:19:10 crc kubenswrapper[4817]: I0218 14:19:10.672220 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759f74666b-ms4jl" event={"ID":"f6b68ae5-35a8-4050-9c64-e6ef834803fd","Type":"ContainerStarted","Data":"321909f593ad8bf7ac3090c0f7f9ee66f23d18c424d58a88d9b613533e01c26a"} Feb 18 14:19:10 crc kubenswrapper[4817]: I0218 14:19:10.672418 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:10 crc kubenswrapper[4817]: I0218 14:19:10.672455 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:10 crc kubenswrapper[4817]: I0218 14:19:10.673598 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78b498b86c-nn4vc" event={"ID":"864e8c7f-e3b7-4f27-960e-df753b339571","Type":"ContainerStarted","Data":"8f045829cf68219cf2fe01630d3bab7f3acefe98aeba4d13f745c2637ea19bd4"} Feb 18 14:19:10 crc kubenswrapper[4817]: I0218 14:19:10.696921 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-759f74666b-ms4jl" podStartSLOduration=8.696897923 podStartE2EDuration="8.696897923s" podCreationTimestamp="2026-02-18 14:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 
14:19:10.689171789 +0000 UTC m=+1213.264707812" watchObservedRunningTime="2026-02-18 14:19:10.696897923 +0000 UTC m=+1213.272433906" Feb 18 14:19:11 crc kubenswrapper[4817]: I0218 14:19:11.694406 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bf77b97db-hknps" event={"ID":"33929d5f-e679-44e2-a0e9-816088e17cb1","Type":"ContainerStarted","Data":"5abdc68834f234dc0435fd3e9a5ac9a4b7375dadca316f4aef9d3229854c334c"} Feb 18 14:19:11 crc kubenswrapper[4817]: I0218 14:19:11.695023 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5bf77b97db-hknps" event={"ID":"33929d5f-e679-44e2-a0e9-816088e17cb1","Type":"ContainerStarted","Data":"026b249dc2955babeb33cc73d49c6a86131c6213dbd03dd060fcf3bd51f1ce70"} Feb 18 14:19:11 crc kubenswrapper[4817]: I0218 14:19:11.710854 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78b498b86c-nn4vc" event={"ID":"864e8c7f-e3b7-4f27-960e-df753b339571","Type":"ContainerStarted","Data":"415dfab3e15806a33a5b392118a21fc6e9ed461317b6835a8a1f305e836f5680"} Feb 18 14:19:11 crc kubenswrapper[4817]: I0218 14:19:11.724828 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5bf77b97db-hknps" podStartSLOduration=3.645934527 podStartE2EDuration="13.724811462s" podCreationTimestamp="2026-02-18 14:18:58 +0000 UTC" firstStartedPulling="2026-02-18 14:19:00.156646951 +0000 UTC m=+1202.732182934" lastFinishedPulling="2026-02-18 14:19:10.235523886 +0000 UTC m=+1212.811059869" observedRunningTime="2026-02-18 14:19:11.717994851 +0000 UTC m=+1214.293530844" watchObservedRunningTime="2026-02-18 14:19:11.724811462 +0000 UTC m=+1214.300347445" Feb 18 14:19:11 crc kubenswrapper[4817]: I0218 14:19:11.756936 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-78b498b86c-nn4vc" podStartSLOduration=4.039532123 
podStartE2EDuration="13.75691571s" podCreationTimestamp="2026-02-18 14:18:58 +0000 UTC" firstStartedPulling="2026-02-18 14:19:00.094955441 +0000 UTC m=+1202.670491424" lastFinishedPulling="2026-02-18 14:19:09.812339028 +0000 UTC m=+1212.387875011" observedRunningTime="2026-02-18 14:19:11.752186661 +0000 UTC m=+1214.327722644" watchObservedRunningTime="2026-02-18 14:19:11.75691571 +0000 UTC m=+1214.332451693" Feb 18 14:19:12 crc kubenswrapper[4817]: I0218 14:19:12.060509 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 14:19:12 crc kubenswrapper[4817]: I0218 14:19:12.113244 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 14:19:12 crc kubenswrapper[4817]: I0218 14:19:12.136735 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 14:19:12 crc kubenswrapper[4817]: I0218 14:19:12.863193 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:19:12 crc kubenswrapper[4817]: I0218 14:19:12.863267 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.423853 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.425435 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.427913 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.427956 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-km6vl" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.445499 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.456132 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.527623 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.527744 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config-secret\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.527968 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrsnr\" (UniqueName: \"kubernetes.io/projected/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-kube-api-access-vrsnr\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.528031 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.629746 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrsnr\" (UniqueName: \"kubernetes.io/projected/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-kube-api-access-vrsnr\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.629840 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.629911 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.629968 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config-secret\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.632141 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.638394 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config-secret\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.638585 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.652536 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrsnr\" (UniqueName: \"kubernetes.io/projected/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-kube-api-access-vrsnr\") pod \"openstackclient\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.742866 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.809321 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.828410 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.861510 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.868384 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.915694 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.939586 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvd8b\" (UniqueName: \"kubernetes.io/projected/ace81bfb-db15-429f-9168-936817dad694-kube-api-access-pvd8b\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.939655 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ace81bfb-db15-429f-9168-936817dad694-openstack-config\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.939770 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ace81bfb-db15-429f-9168-936817dad694-openstack-config-secret\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " 
pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: I0218 14:19:13.939869 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace81bfb-db15-429f-9168-936817dad694-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:13 crc kubenswrapper[4817]: E0218 14:19:13.957882 4817 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 18 14:19:13 crc kubenswrapper[4817]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_8c3e6421-4a5d-4da3-a1c2-4ae21ae25115_0(2c93f65664dbdd14dcb4bcc0e9f1346ec94e346cb6e17e3e5b0526267a48c5e9): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2c93f65664dbdd14dcb4bcc0e9f1346ec94e346cb6e17e3e5b0526267a48c5e9" Netns:"/var/run/netns/2cae2369-b876-410c-b221-f21d4b042954" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=2c93f65664dbdd14dcb4bcc0e9f1346ec94e346cb6e17e3e5b0526267a48c5e9;K8S_POD_UID=8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115]: expected pod UID "8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" but got "ace81bfb-db15-429f-9168-936817dad694" from Kube API Feb 18 14:19:13 crc kubenswrapper[4817]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 18 14:19:13 crc kubenswrapper[4817]: > Feb 18 14:19:13 crc kubenswrapper[4817]: E0218 14:19:13.957968 4817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 18 14:19:13 crc kubenswrapper[4817]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_8c3e6421-4a5d-4da3-a1c2-4ae21ae25115_0(2c93f65664dbdd14dcb4bcc0e9f1346ec94e346cb6e17e3e5b0526267a48c5e9): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2c93f65664dbdd14dcb4bcc0e9f1346ec94e346cb6e17e3e5b0526267a48c5e9" Netns:"/var/run/netns/2cae2369-b876-410c-b221-f21d4b042954" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=2c93f65664dbdd14dcb4bcc0e9f1346ec94e346cb6e17e3e5b0526267a48c5e9;K8S_POD_UID=8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115]: expected pod UID "8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" but got "ace81bfb-db15-429f-9168-936817dad694" from Kube API Feb 18 14:19:13 crc kubenswrapper[4817]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 18 14:19:13 crc kubenswrapper[4817]: > pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.042028 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ace81bfb-db15-429f-9168-936817dad694-openstack-config-secret\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.042177 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace81bfb-db15-429f-9168-936817dad694-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.042232 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvd8b\" (UniqueName: \"kubernetes.io/projected/ace81bfb-db15-429f-9168-936817dad694-kube-api-access-pvd8b\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.042269 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ace81bfb-db15-429f-9168-936817dad694-openstack-config\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.043208 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ace81bfb-db15-429f-9168-936817dad694-openstack-config\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.046594 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace81bfb-db15-429f-9168-936817dad694-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.046694 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ace81bfb-db15-429f-9168-936817dad694-openstack-config-secret\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.065444 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvd8b\" (UniqueName: \"kubernetes.io/projected/ace81bfb-db15-429f-9168-936817dad694-kube-api-access-pvd8b\") pod \"openstackclient\" (UID: \"ace81bfb-db15-429f-9168-936817dad694\") " pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.070625 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.208569 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.493177 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.560639 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-kg5vn"] Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.560859 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" podUID="6690ffc6-f624-448b-81d7-e36a8e059a44" containerName="dnsmasq-dns" containerID="cri-o://ec983c9bde2a9cfc3193bd7dfe622c99f90676a21ce1f58fa9eaf2e7ca242469" gracePeriod=10 Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.735791 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 18 14:19:14 crc kubenswrapper[4817]: W0218 14:19:14.739315 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podace81bfb_db15_429f_9168_936817dad694.slice/crio-548064e943e62ab2273cf5924daf71be00ac8a37b110197735841fd772c0ede6 WatchSource:0}: Error finding container 548064e943e62ab2273cf5924daf71be00ac8a37b110197735841fd772c0ede6: Status 404 returned error can't find the container with id 548064e943e62ab2273cf5924daf71be00ac8a37b110197735841fd772c0ede6 Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.760202 4817 generic.go:334] "Generic (PLEG): container finished" podID="6690ffc6-f624-448b-81d7-e36a8e059a44" containerID="ec983c9bde2a9cfc3193bd7dfe622c99f90676a21ce1f58fa9eaf2e7ca242469" exitCode=0 Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.760574 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.761265 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" event={"ID":"6690ffc6-f624-448b-81d7-e36a8e059a44","Type":"ContainerDied","Data":"ec983c9bde2a9cfc3193bd7dfe622c99f90676a21ce1f58fa9eaf2e7ca242469"} Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.884673 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.889294 4817 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" podUID="ace81bfb-db15-429f-9168-936817dad694" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.909913 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b57694bbd-rpg5b" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.958420 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-combined-ca-bundle\") pod \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.958647 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config-secret\") pod \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.958697 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrsnr\" (UniqueName: \"kubernetes.io/projected/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-kube-api-access-vrsnr\") 
pod \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.958852 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config\") pod \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\" (UID: \"8c3e6421-4a5d-4da3-a1c2-4ae21ae25115\") " Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.960224 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" (UID: "8c3e6421-4a5d-4da3-a1c2-4ae21ae25115"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.998868 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" (UID: "8c3e6421-4a5d-4da3-a1c2-4ae21ae25115"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.998946 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-kube-api-access-vrsnr" (OuterVolumeSpecName: "kube-api-access-vrsnr") pod "8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" (UID: "8c3e6421-4a5d-4da3-a1c2-4ae21ae25115"). InnerVolumeSpecName "kube-api-access-vrsnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:14 crc kubenswrapper[4817]: I0218 14:19:14.999076 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" (UID: "8c3e6421-4a5d-4da3-a1c2-4ae21ae25115"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.061759 4817 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.061794 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrsnr\" (UniqueName: \"kubernetes.io/projected/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-kube-api-access-vrsnr\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.061804 4817 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.061813 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.196735 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.265068 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-nb\") pod \"6690ffc6-f624-448b-81d7-e36a8e059a44\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.265126 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-swift-storage-0\") pod \"6690ffc6-f624-448b-81d7-e36a8e059a44\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.265205 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-config\") pod \"6690ffc6-f624-448b-81d7-e36a8e059a44\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.265289 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-svc\") pod \"6690ffc6-f624-448b-81d7-e36a8e059a44\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.265307 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-sb\") pod \"6690ffc6-f624-448b-81d7-e36a8e059a44\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.265325 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtrbb\" 
(UniqueName: \"kubernetes.io/projected/6690ffc6-f624-448b-81d7-e36a8e059a44-kube-api-access-rtrbb\") pod \"6690ffc6-f624-448b-81d7-e36a8e059a44\" (UID: \"6690ffc6-f624-448b-81d7-e36a8e059a44\") " Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.347944 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6690ffc6-f624-448b-81d7-e36a8e059a44-kube-api-access-rtrbb" (OuterVolumeSpecName: "kube-api-access-rtrbb") pod "6690ffc6-f624-448b-81d7-e36a8e059a44" (UID: "6690ffc6-f624-448b-81d7-e36a8e059a44"). InnerVolumeSpecName "kube-api-access-rtrbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.372696 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtrbb\" (UniqueName: \"kubernetes.io/projected/6690ffc6-f624-448b-81d7-e36a8e059a44-kube-api-access-rtrbb\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.515087 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6690ffc6-f624-448b-81d7-e36a8e059a44" (UID: "6690ffc6-f624-448b-81d7-e36a8e059a44"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.523581 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6690ffc6-f624-448b-81d7-e36a8e059a44" (UID: "6690ffc6-f624-448b-81d7-e36a8e059a44"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.535531 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-config" (OuterVolumeSpecName: "config") pod "6690ffc6-f624-448b-81d7-e36a8e059a44" (UID: "6690ffc6-f624-448b-81d7-e36a8e059a44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.535857 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6690ffc6-f624-448b-81d7-e36a8e059a44" (UID: "6690ffc6-f624-448b-81d7-e36a8e059a44"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.577860 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.577902 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.577914 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.577928 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:15 
crc kubenswrapper[4817]: I0218 14:19:15.581819 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6690ffc6-f624-448b-81d7-e36a8e059a44" (UID: "6690ffc6-f624-448b-81d7-e36a8e059a44"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.679914 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6690ffc6-f624-448b-81d7-e36a8e059a44-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.776776 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.776763 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-kg5vn" event={"ID":"6690ffc6-f624-448b-81d7-e36a8e059a44","Type":"ContainerDied","Data":"18ff0d42aef7248985e94b80487347aca4fe620a6723074d51de5283441d48b3"} Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.777040 4817 scope.go:117] "RemoveContainer" containerID="ec983c9bde2a9cfc3193bd7dfe622c99f90676a21ce1f58fa9eaf2e7ca242469" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.792808 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.792822 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ace81bfb-db15-429f-9168-936817dad694","Type":"ContainerStarted","Data":"548064e943e62ab2273cf5924daf71be00ac8a37b110197735841fd772c0ede6"} Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.822005 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-kg5vn"] Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.822274 4817 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" podUID="ace81bfb-db15-429f-9168-936817dad694" Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.832857 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-kg5vn"] Feb 18 14:19:15 crc kubenswrapper[4817]: I0218 14:19:15.834711 4817 scope.go:117] "RemoveContainer" containerID="424995e206af878c989f6525a5276cac37e9a5d54275e333164250b565d75508" Feb 18 14:19:16 crc kubenswrapper[4817]: I0218 14:19:16.189495 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6690ffc6-f624-448b-81d7-e36a8e059a44" path="/var/lib/kubelet/pods/6690ffc6-f624-448b-81d7-e36a8e059a44/volumes" Feb 18 14:19:16 crc kubenswrapper[4817]: I0218 14:19:16.190259 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c3e6421-4a5d-4da3-a1c2-4ae21ae25115" path="/var/lib/kubelet/pods/8c3e6421-4a5d-4da3-a1c2-4ae21ae25115/volumes" Feb 18 14:19:16 crc kubenswrapper[4817]: I0218 14:19:16.650017 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" Feb 18 14:19:16 crc kubenswrapper[4817]: I0218 14:19:16.860061 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-64ff6b6cd6-6qg5b" Feb 18 14:19:19 crc kubenswrapper[4817]: I0218 14:19:19.343462 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:19 crc kubenswrapper[4817]: I0218 14:19:19.732524 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-759f74666b-ms4jl" Feb 18 14:19:19 crc kubenswrapper[4817]: I0218 14:19:19.834094 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-64ff6b6cd6-6qg5b"] Feb 18 14:19:19 crc kubenswrapper[4817]: I0218 14:19:19.834365 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api-log" containerID="cri-o://78c0dd6a38727f9c2bedc253d7c6cf61f7a740807fbceb83f866aef29681a977" gracePeriod=30 Feb 18 14:19:19 crc kubenswrapper[4817]: I0218 14:19:19.834479 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api" containerID="cri-o://ca9e72e424c10baf7d397ed920fbc591f0259c9858ae4eb24d73920e41c1129f" gracePeriod=30 Feb 18 14:19:20 crc kubenswrapper[4817]: I0218 14:19:20.872085 4817 generic.go:334] "Generic (PLEG): container finished" podID="0e385fdc-9c05-49ce-a823-dd99efa98e94" containerID="00e17bd6056b65696e77d9641aa5c8e9534f3cfe54f718e4038fd916bdae0100" exitCode=0 Feb 18 14:19:20 crc kubenswrapper[4817]: I0218 14:19:20.872144 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-4xc6g" event={"ID":"0e385fdc-9c05-49ce-a823-dd99efa98e94","Type":"ContainerDied","Data":"00e17bd6056b65696e77d9641aa5c8e9534f3cfe54f718e4038fd916bdae0100"} Feb 18 14:19:20 crc kubenswrapper[4817]: I0218 14:19:20.876754 4817 generic.go:334] "Generic (PLEG): container finished" 
podID="07f0e519-a5f3-45a2-a5da-e10f851f18df" containerID="215f78b9b93a35d655cb71cb9ad214093e60b960a2b02feae20952738469759d" exitCode=0 Feb 18 14:19:20 crc kubenswrapper[4817]: I0218 14:19:20.876814 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2pxsw" event={"ID":"07f0e519-a5f3-45a2-a5da-e10f851f18df","Type":"ContainerDied","Data":"215f78b9b93a35d655cb71cb9ad214093e60b960a2b02feae20952738469759d"} Feb 18 14:19:20 crc kubenswrapper[4817]: I0218 14:19:20.878357 4817 generic.go:334] "Generic (PLEG): container finished" podID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerID="78c0dd6a38727f9c2bedc253d7c6cf61f7a740807fbceb83f866aef29681a977" exitCode=143 Feb 18 14:19:20 crc kubenswrapper[4817]: I0218 14:19:20.878384 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" event={"ID":"8b0830a0-1f90-4b33-8976-875adeb804f9","Type":"ContainerDied","Data":"78c0dd6a38727f9c2bedc253d7c6cf61f7a740807fbceb83f866aef29681a977"} Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.461104 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-549ff9d7ff-4pfxq"] Feb 18 14:19:21 crc kubenswrapper[4817]: E0218 14:19:21.461638 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6690ffc6-f624-448b-81d7-e36a8e059a44" containerName="dnsmasq-dns" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.461656 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6690ffc6-f624-448b-81d7-e36a8e059a44" containerName="dnsmasq-dns" Feb 18 14:19:21 crc kubenswrapper[4817]: E0218 14:19:21.461696 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6690ffc6-f624-448b-81d7-e36a8e059a44" containerName="init" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.461705 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6690ffc6-f624-448b-81d7-e36a8e059a44" containerName="init" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.461948 
4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="6690ffc6-f624-448b-81d7-e36a8e059a44" containerName="dnsmasq-dns" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.463302 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.465905 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.466090 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.466291 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.479794 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-549ff9d7ff-4pfxq"] Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.595235 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g25mz\" (UniqueName: \"kubernetes.io/projected/85a61008-fd45-4598-90bc-b0cf2856cefa-kube-api-access-g25mz\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.595303 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-config-data\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.595340 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/85a61008-fd45-4598-90bc-b0cf2856cefa-run-httpd\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.595387 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85a61008-fd45-4598-90bc-b0cf2856cefa-log-httpd\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.595639 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-public-tls-certs\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.595690 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-combined-ca-bundle\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.595756 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-internal-tls-certs\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.595780 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85a61008-fd45-4598-90bc-b0cf2856cefa-etc-swift\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.697420 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-public-tls-certs\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.697522 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-combined-ca-bundle\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.697589 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-internal-tls-certs\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.697608 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85a61008-fd45-4598-90bc-b0cf2856cefa-etc-swift\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.697655 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g25mz\" (UniqueName: 
\"kubernetes.io/projected/85a61008-fd45-4598-90bc-b0cf2856cefa-kube-api-access-g25mz\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.697686 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-config-data\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.697721 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85a61008-fd45-4598-90bc-b0cf2856cefa-run-httpd\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.697765 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85a61008-fd45-4598-90bc-b0cf2856cefa-log-httpd\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.698348 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85a61008-fd45-4598-90bc-b0cf2856cefa-run-httpd\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.698580 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85a61008-fd45-4598-90bc-b0cf2856cefa-log-httpd\") pod 
\"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.710391 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-config-data\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.714941 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-internal-tls-certs\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.715110 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85a61008-fd45-4598-90bc-b0cf2856cefa-etc-swift\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.723740 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-public-tls-certs\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.736003 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a61008-fd45-4598-90bc-b0cf2856cefa-combined-ca-bundle\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " 
pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.750956 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g25mz\" (UniqueName: \"kubernetes.io/projected/85a61008-fd45-4598-90bc-b0cf2856cefa-kube-api-access-g25mz\") pod \"swift-proxy-549ff9d7ff-4pfxq\" (UID: \"85a61008-fd45-4598-90bc-b0cf2856cefa\") " pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:21 crc kubenswrapper[4817]: I0218 14:19:21.789060 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.248015 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.251678 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="ceilometer-central-agent" containerID="cri-o://446fd5b3ecb61957312dd974bf64fcfa6943b99e18c4a83378af0878c142cda6" gracePeriod=30 Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.251917 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="proxy-httpd" containerID="cri-o://4349e8f38b3e490984b83b035c211819bd8b59dddba035180f6537e1e67865b1" gracePeriod=30 Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.252045 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="sg-core" containerID="cri-o://dde7cabe52ee3128f06443a1007f42fee418d5247be7aa5cbc03fef4f7e1dcc5" gracePeriod=30 Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.252176 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="ceilometer-notification-agent" containerID="cri-o://c3a599395d06f206c5faa5386bea5b3abc4cb33817b5a9e4c092c926794b4c0f" gracePeriod=30 Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.252591 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.261201 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.169:3000/\": EOF" Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.924496 4817 generic.go:334] "Generic (PLEG): container finished" podID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerID="4349e8f38b3e490984b83b035c211819bd8b59dddba035180f6537e1e67865b1" exitCode=0 Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.924825 4817 generic.go:334] "Generic (PLEG): container finished" podID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerID="dde7cabe52ee3128f06443a1007f42fee418d5247be7aa5cbc03fef4f7e1dcc5" exitCode=2 Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.924834 4817 generic.go:334] "Generic (PLEG): container finished" podID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerID="446fd5b3ecb61957312dd974bf64fcfa6943b99e18c4a83378af0878c142cda6" exitCode=0 Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.924589 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69289a6c-7b95-4c16-a326-dab582fc4b86","Type":"ContainerDied","Data":"4349e8f38b3e490984b83b035c211819bd8b59dddba035180f6537e1e67865b1"} Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.924908 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"69289a6c-7b95-4c16-a326-dab582fc4b86","Type":"ContainerDied","Data":"dde7cabe52ee3128f06443a1007f42fee418d5247be7aa5cbc03fef4f7e1dcc5"} Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.924923 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69289a6c-7b95-4c16-a326-dab582fc4b86","Type":"ContainerDied","Data":"446fd5b3ecb61957312dd974bf64fcfa6943b99e18c4a83378af0878c142cda6"} Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.928175 4817 generic.go:334] "Generic (PLEG): container finished" podID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerID="ca9e72e424c10baf7d397ed920fbc591f0259c9858ae4eb24d73920e41c1129f" exitCode=0 Feb 18 14:19:23 crc kubenswrapper[4817]: I0218 14:19:23.928204 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" event={"ID":"8b0830a0-1f90-4b33-8976-875adeb804f9","Type":"ContainerDied","Data":"ca9e72e424c10baf7d397ed920fbc591f0259c9858ae4eb24d73920e41c1129f"} Feb 18 14:19:24 crc kubenswrapper[4817]: I0218 14:19:24.597941 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": dial tcp 10.217.0.173:9311: connect: connection refused" Feb 18 14:19:24 crc kubenswrapper[4817]: I0218 14:19:24.598540 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": dial tcp 10.217.0.173:9311: connect: connection refused" Feb 18 14:19:24 crc kubenswrapper[4817]: I0218 14:19:24.751871 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="proxy-httpd" 
probeResult="failure" output="Get \"http://10.217.0.169:3000/\": dial tcp 10.217.0.169:3000: connect: connection refused" Feb 18 14:19:24 crc kubenswrapper[4817]: I0218 14:19:24.942034 4817 generic.go:334] "Generic (PLEG): container finished" podID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerID="c3a599395d06f206c5faa5386bea5b3abc4cb33817b5a9e4c092c926794b4c0f" exitCode=0 Feb 18 14:19:24 crc kubenswrapper[4817]: I0218 14:19:24.942230 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69289a6c-7b95-4c16-a326-dab582fc4b86","Type":"ContainerDied","Data":"c3a599395d06f206c5faa5386bea5b3abc4cb33817b5a9e4c092c926794b4c0f"} Feb 18 14:19:24 crc kubenswrapper[4817]: I0218 14:19:24.944146 4817 generic.go:334] "Generic (PLEG): container finished" podID="fb12a33e-172a-4c2d-8c97-8ae5486ce22d" containerID="47021a4d2507c0e35400943910b6d9219bf27a681ec61204b6398fd99f2a3061" exitCode=0 Feb 18 14:19:24 crc kubenswrapper[4817]: I0218 14:19:24.944173 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w48xt" event={"ID":"fb12a33e-172a-4c2d-8c97-8ae5486ce22d","Type":"ContainerDied","Data":"47021a4d2507c0e35400943910b6d9219bf27a681ec61204b6398fd99f2a3061"} Feb 18 14:19:26 crc kubenswrapper[4817]: I0218 14:19:26.970539 4817 generic.go:334] "Generic (PLEG): container finished" podID="8de51007-ada2-49f5-90b2-11151899e3cf" containerID="a2b4cacda3d056587cd39f7a1cbb66789b5d848ebe2fd068888ed5b7d9b5436a" exitCode=0 Feb 18 14:19:26 crc kubenswrapper[4817]: I0218 14:19:26.970654 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fk5c" event={"ID":"8de51007-ada2-49f5-90b2-11151899e3cf","Type":"ContainerDied","Data":"a2b4cacda3d056587cd39f7a1cbb66789b5d848ebe2fd068888ed5b7d9b5436a"} Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.433718 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.443494 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w48xt" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.459800 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.531877 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-config\") pod \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.531929 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-combined-ca-bundle\") pod \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.531961 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72d54\" (UniqueName: \"kubernetes.io/projected/07f0e519-a5f3-45a2-a5da-e10f851f18df-kube-api-access-72d54\") pod \"07f0e519-a5f3-45a2-a5da-e10f851f18df\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532029 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-certs\") pod \"0e385fdc-9c05-49ce-a823-dd99efa98e94\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532079 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-config-data\") pod \"0e385fdc-9c05-49ce-a823-dd99efa98e94\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532109 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-combined-ca-bundle\") pod \"07f0e519-a5f3-45a2-a5da-e10f851f18df\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532127 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzncd\" (UniqueName: \"kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-kube-api-access-fzncd\") pod \"0e385fdc-9c05-49ce-a823-dd99efa98e94\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532181 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-config-data\") pod \"07f0e519-a5f3-45a2-a5da-e10f851f18df\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532216 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-db-sync-config-data\") pod \"07f0e519-a5f3-45a2-a5da-e10f851f18df\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532241 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-combined-ca-bundle\") pod \"0e385fdc-9c05-49ce-a823-dd99efa98e94\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " Feb 18 14:19:27 crc 
kubenswrapper[4817]: I0218 14:19:27.532264 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r97sn\" (UniqueName: \"kubernetes.io/projected/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-kube-api-access-r97sn\") pod \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\" (UID: \"fb12a33e-172a-4c2d-8c97-8ae5486ce22d\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532307 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-scripts\") pod \"07f0e519-a5f3-45a2-a5da-e10f851f18df\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532321 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-scripts\") pod \"0e385fdc-9c05-49ce-a823-dd99efa98e94\" (UID: \"0e385fdc-9c05-49ce-a823-dd99efa98e94\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532363 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07f0e519-a5f3-45a2-a5da-e10f851f18df-etc-machine-id\") pod \"07f0e519-a5f3-45a2-a5da-e10f851f18df\" (UID: \"07f0e519-a5f3-45a2-a5da-e10f851f18df\") " Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.532777 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07f0e519-a5f3-45a2-a5da-e10f851f18df-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "07f0e519-a5f3-45a2-a5da-e10f851f18df" (UID: "07f0e519-a5f3-45a2-a5da-e10f851f18df"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.538284 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f0e519-a5f3-45a2-a5da-e10f851f18df-kube-api-access-72d54" (OuterVolumeSpecName: "kube-api-access-72d54") pod "07f0e519-a5f3-45a2-a5da-e10f851f18df" (UID: "07f0e519-a5f3-45a2-a5da-e10f851f18df"). InnerVolumeSpecName "kube-api-access-72d54". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.540521 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-certs" (OuterVolumeSpecName: "certs") pod "0e385fdc-9c05-49ce-a823-dd99efa98e94" (UID: "0e385fdc-9c05-49ce-a823-dd99efa98e94"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.541133 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-scripts" (OuterVolumeSpecName: "scripts") pod "07f0e519-a5f3-45a2-a5da-e10f851f18df" (UID: "07f0e519-a5f3-45a2-a5da-e10f851f18df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.541556 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "07f0e519-a5f3-45a2-a5da-e10f851f18df" (UID: "07f0e519-a5f3-45a2-a5da-e10f851f18df"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.550584 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-kube-api-access-fzncd" (OuterVolumeSpecName: "kube-api-access-fzncd") pod "0e385fdc-9c05-49ce-a823-dd99efa98e94" (UID: "0e385fdc-9c05-49ce-a823-dd99efa98e94"). InnerVolumeSpecName "kube-api-access-fzncd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.551063 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-kube-api-access-r97sn" (OuterVolumeSpecName: "kube-api-access-r97sn") pod "fb12a33e-172a-4c2d-8c97-8ae5486ce22d" (UID: "fb12a33e-172a-4c2d-8c97-8ae5486ce22d"). InnerVolumeSpecName "kube-api-access-r97sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.556323 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-scripts" (OuterVolumeSpecName: "scripts") pod "0e385fdc-9c05-49ce-a823-dd99efa98e94" (UID: "0e385fdc-9c05-49ce-a823-dd99efa98e94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.578068 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e385fdc-9c05-49ce-a823-dd99efa98e94" (UID: "0e385fdc-9c05-49ce-a823-dd99efa98e94"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.588735 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-config-data" (OuterVolumeSpecName: "config-data") pod "0e385fdc-9c05-49ce-a823-dd99efa98e94" (UID: "0e385fdc-9c05-49ce-a823-dd99efa98e94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.600327 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07f0e519-a5f3-45a2-a5da-e10f851f18df" (UID: "07f0e519-a5f3-45a2-a5da-e10f851f18df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.613756 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-config" (OuterVolumeSpecName: "config") pod "fb12a33e-172a-4c2d-8c97-8ae5486ce22d" (UID: "fb12a33e-172a-4c2d-8c97-8ae5486ce22d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.613828 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-config-data" (OuterVolumeSpecName: "config-data") pod "07f0e519-a5f3-45a2-a5da-e10f851f18df" (UID: "07f0e519-a5f3-45a2-a5da-e10f851f18df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.626487 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb12a33e-172a-4c2d-8c97-8ae5486ce22d" (UID: "fb12a33e-172a-4c2d-8c97-8ae5486ce22d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638756 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638788 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzncd\" (UniqueName: \"kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-kube-api-access-fzncd\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638799 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638807 4817 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638818 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638826 4817 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-r97sn\" (UniqueName: \"kubernetes.io/projected/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-kube-api-access-r97sn\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638834 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07f0e519-a5f3-45a2-a5da-e10f851f18df-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638841 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638849 4817 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07f0e519-a5f3-45a2-a5da-e10f851f18df-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638857 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638866 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb12a33e-172a-4c2d-8c97-8ae5486ce22d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638874 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72d54\" (UniqueName: \"kubernetes.io/projected/07f0e519-a5f3-45a2-a5da-e10f851f18df-kube-api-access-72d54\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.638881 4817 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e385fdc-9c05-49ce-a823-dd99efa98e94-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 
crc kubenswrapper[4817]: I0218 14:19:27.638889 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e385fdc-9c05-49ce-a823-dd99efa98e94-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.982611 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w48xt" event={"ID":"fb12a33e-172a-4c2d-8c97-8ae5486ce22d","Type":"ContainerDied","Data":"429985b7d225d27c1d2c8b413a6eb9965dcb67726c7bf772f1316f08f004e838"} Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.982651 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="429985b7d225d27c1d2c8b413a6eb9965dcb67726c7bf772f1316f08f004e838" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.982776 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w48xt" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.983865 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-4xc6g" event={"ID":"0e385fdc-9c05-49ce-a823-dd99efa98e94","Type":"ContainerDied","Data":"48d77ced66073362c74a1ad7dac9c386205f7a7b9ae6a91001dd855c7440a975"} Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.983883 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48d77ced66073362c74a1ad7dac9c386205f7a7b9ae6a91001dd855c7440a975" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.983968 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-4xc6g" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.986208 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2pxsw" Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.987609 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2pxsw" event={"ID":"07f0e519-a5f3-45a2-a5da-e10f851f18df","Type":"ContainerDied","Data":"c24ed8c4067b5ac77147b6ac1e83f5638ebec6400da8610a6339156d9754294b"} Feb 18 14:19:27 crc kubenswrapper[4817]: I0218 14:19:27.987668 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c24ed8c4067b5ac77147b6ac1e83f5638ebec6400da8610a6339156d9754294b" Feb 18 14:19:28 crc kubenswrapper[4817]: E0218 14:19:28.305600 4817 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e385fdc_9c05_49ce_a823_dd99efa98e94.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e385fdc_9c05_49ce_a823_dd99efa98e94.slice/crio-48d77ced66073362c74a1ad7dac9c386205f7a7b9ae6a91001dd855c7440a975\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07f0e519_a5f3_45a2_a5da_e10f851f18df.slice\": RecentStats: unable to find data in memory cache]" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.342473 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.345363 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.466529 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data\") pod \"8b0830a0-1f90-4b33-8976-875adeb804f9\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.466601 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-combined-ca-bundle\") pod \"8b0830a0-1f90-4b33-8976-875adeb804f9\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.466696 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-log-httpd\") pod \"69289a6c-7b95-4c16-a326-dab582fc4b86\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.466790 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-run-httpd\") pod \"69289a6c-7b95-4c16-a326-dab582fc4b86\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.466813 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-scripts\") pod \"69289a6c-7b95-4c16-a326-dab582fc4b86\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.466905 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csp4q\" (UniqueName: 
\"kubernetes.io/projected/8b0830a0-1f90-4b33-8976-875adeb804f9-kube-api-access-csp4q\") pod \"8b0830a0-1f90-4b33-8976-875adeb804f9\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.466937 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-config-data\") pod \"69289a6c-7b95-4c16-a326-dab582fc4b86\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.467069 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b0830a0-1f90-4b33-8976-875adeb804f9-logs\") pod \"8b0830a0-1f90-4b33-8976-875adeb804f9\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.467101 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data-custom\") pod \"8b0830a0-1f90-4b33-8976-875adeb804f9\" (UID: \"8b0830a0-1f90-4b33-8976-875adeb804f9\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.467123 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pf8t\" (UniqueName: \"kubernetes.io/projected/69289a6c-7b95-4c16-a326-dab582fc4b86-kube-api-access-5pf8t\") pod \"69289a6c-7b95-4c16-a326-dab582fc4b86\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.467168 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-combined-ca-bundle\") pod \"69289a6c-7b95-4c16-a326-dab582fc4b86\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 
14:19:28.467210 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-sg-core-conf-yaml\") pod \"69289a6c-7b95-4c16-a326-dab582fc4b86\" (UID: \"69289a6c-7b95-4c16-a326-dab582fc4b86\") " Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.467938 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69289a6c-7b95-4c16-a326-dab582fc4b86" (UID: "69289a6c-7b95-4c16-a326-dab582fc4b86"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.470290 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b0830a0-1f90-4b33-8976-875adeb804f9-logs" (OuterVolumeSpecName: "logs") pod "8b0830a0-1f90-4b33-8976-875adeb804f9" (UID: "8b0830a0-1f90-4b33-8976-875adeb804f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.474705 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69289a6c-7b95-4c16-a326-dab582fc4b86" (UID: "69289a6c-7b95-4c16-a326-dab582fc4b86"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.479285 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69289a6c-7b95-4c16-a326-dab582fc4b86-kube-api-access-5pf8t" (OuterVolumeSpecName: "kube-api-access-5pf8t") pod "69289a6c-7b95-4c16-a326-dab582fc4b86" (UID: "69289a6c-7b95-4c16-a326-dab582fc4b86"). InnerVolumeSpecName "kube-api-access-5pf8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.479519 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b0830a0-1f90-4b33-8976-875adeb804f9-kube-api-access-csp4q" (OuterVolumeSpecName: "kube-api-access-csp4q") pod "8b0830a0-1f90-4b33-8976-875adeb804f9" (UID: "8b0830a0-1f90-4b33-8976-875adeb804f9"). InnerVolumeSpecName "kube-api-access-csp4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.483211 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8b0830a0-1f90-4b33-8976-875adeb804f9" (UID: "8b0830a0-1f90-4b33-8976-875adeb804f9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.490463 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-scripts" (OuterVolumeSpecName: "scripts") pod "69289a6c-7b95-4c16-a326-dab582fc4b86" (UID: "69289a6c-7b95-4c16-a326-dab582fc4b86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.527238 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69289a6c-7b95-4c16-a326-dab582fc4b86" (UID: "69289a6c-7b95-4c16-a326-dab582fc4b86"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.560587 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b0830a0-1f90-4b33-8976-875adeb804f9" (UID: "8b0830a0-1f90-4b33-8976-875adeb804f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.570203 4817 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.570244 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.570259 4817 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.570270 4817 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69289a6c-7b95-4c16-a326-dab582fc4b86-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.570282 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.570293 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csp4q\" (UniqueName: 
\"kubernetes.io/projected/8b0830a0-1f90-4b33-8976-875adeb804f9-kube-api-access-csp4q\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.570307 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b0830a0-1f90-4b33-8976-875adeb804f9-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.570319 4817 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.570333 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pf8t\" (UniqueName: \"kubernetes.io/projected/69289a6c-7b95-4c16-a326-dab582fc4b86-kube-api-access-5pf8t\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.574428 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data" (OuterVolumeSpecName: "config-data") pod "8b0830a0-1f90-4b33-8976-875adeb804f9" (UID: "8b0830a0-1f90-4b33-8976-875adeb804f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.591183 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69289a6c-7b95-4c16-a326-dab582fc4b86" (UID: "69289a6c-7b95-4c16-a326-dab582fc4b86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.662216 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-jgz44"] Feb 18 14:19:28 crc kubenswrapper[4817]: E0218 14:19:28.662681 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e385fdc-9c05-49ce-a823-dd99efa98e94" containerName="cloudkitty-db-sync" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.662700 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e385fdc-9c05-49ce-a823-dd99efa98e94" containerName="cloudkitty-db-sync" Feb 18 14:19:28 crc kubenswrapper[4817]: E0218 14:19:28.662717 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.662725 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api" Feb 18 14:19:28 crc kubenswrapper[4817]: E0218 14:19:28.662741 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f0e519-a5f3-45a2-a5da-e10f851f18df" containerName="cinder-db-sync" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.662748 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f0e519-a5f3-45a2-a5da-e10f851f18df" containerName="cinder-db-sync" Feb 18 14:19:28 crc kubenswrapper[4817]: E0218 14:19:28.662766 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="ceilometer-central-agent" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.662773 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="ceilometer-central-agent" Feb 18 14:19:28 crc kubenswrapper[4817]: E0218 14:19:28.662786 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" 
containerName="barbican-api-log" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.662794 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api-log" Feb 18 14:19:28 crc kubenswrapper[4817]: E0218 14:19:28.662807 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="proxy-httpd" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.662813 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="proxy-httpd" Feb 18 14:19:28 crc kubenswrapper[4817]: E0218 14:19:28.662835 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="ceilometer-notification-agent" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.662842 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="ceilometer-notification-agent" Feb 18 14:19:28 crc kubenswrapper[4817]: E0218 14:19:28.662857 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb12a33e-172a-4c2d-8c97-8ae5486ce22d" containerName="neutron-db-sync" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.662864 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb12a33e-172a-4c2d-8c97-8ae5486ce22d" containerName="neutron-db-sync" Feb 18 14:19:28 crc kubenswrapper[4817]: E0218 14:19:28.662878 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="sg-core" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.662885 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="sg-core" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.663108 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" 
containerName="ceilometer-central-agent" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.663128 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb12a33e-172a-4c2d-8c97-8ae5486ce22d" containerName="neutron-db-sync" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.663139 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api-log" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.663155 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f0e519-a5f3-45a2-a5da-e10f851f18df" containerName="cinder-db-sync" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.663169 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="proxy-httpd" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.663182 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" containerName="barbican-api" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.663194 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="sg-core" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.663207 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" containerName="ceilometer-notification-agent" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.663224 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e385fdc-9c05-49ce-a823-dd99efa98e94" containerName="cloudkitty-db-sync" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.664057 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.671882 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-config-data" (OuterVolumeSpecName: "config-data") pod "69289a6c-7b95-4c16-a326-dab582fc4b86" (UID: "69289a6c-7b95-4c16-a326-dab582fc4b86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.672367 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.672741 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.673073 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-zgqz6" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.673798 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.674305 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.674351 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69289a6c-7b95-4c16-a326-dab582fc4b86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.674365 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b0830a0-1f90-4b33-8976-875adeb804f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:28 crc 
kubenswrapper[4817]: I0218 14:19:28.674507 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.698953 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-jgz44"] Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.767659 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-g78nz"] Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.769856 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.780358 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-combined-ca-bundle\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.780654 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gwv8\" (UniqueName: \"kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-kube-api-access-5gwv8\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.780806 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-scripts\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.780921 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-config-data\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.781031 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-certs\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.793169 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-g78nz"] Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.882910 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893011 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-combined-ca-bundle\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893102 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-svc\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893149 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjg5p\" (UniqueName: 
\"kubernetes.io/projected/e4bc0418-8267-476b-9f19-60f05965bfec-kube-api-access-sjg5p\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893210 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gwv8\" (UniqueName: \"kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-kube-api-access-5gwv8\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893243 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-swift-storage-0\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893310 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-scripts\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893373 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-config-data\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893440 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-certs\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893558 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-nb\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893581 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-sb\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.893606 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-config\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.906305 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-combined-ca-bundle\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.919159 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-certs\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.926855 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-config-data\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.934955 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.936829 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-scripts\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.967477 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.976126 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.977283 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gwv8\" (UniqueName: \"kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-kube-api-access-5gwv8\") pod \"cloudkitty-storageinit-jgz44\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.979313 4817 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-cinder-dockercfg-vs6xr" Feb 18 14:19:28 crc kubenswrapper[4817]: I0218 14:19:28.979628 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:28.999422 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.001924 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.026102 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-nb\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.026144 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-sb\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.026175 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-config\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.026405 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-svc\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: 
\"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.026443 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjg5p\" (UniqueName: \"kubernetes.io/projected/e4bc0418-8267-476b-9f19-60f05965bfec-kube-api-access-sjg5p\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.026515 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-swift-storage-0\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.028709 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-config\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.033738 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-svc\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.033919 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-sb\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " 
pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.035480 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-nb\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.027779 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-swift-storage-0\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.056043 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-g78nz"]
Feb 18 14:19:29 crc kubenswrapper[4817]: E0218 14:19:29.056936 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-sjg5p], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" podUID="e4bc0418-8267-476b-9f19-60f05965bfec"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.072362 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ace81bfb-db15-429f-9168-936817dad694","Type":"ContainerStarted","Data":"905e57f8ba7126ce4336ffd2a1b6bfcf1b03dfe88944b85310e838f6ab78630b"}
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.074291 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjg5p\" (UniqueName: \"kubernetes.io/projected/e4bc0418-8267-476b-9f19-60f05965bfec-kube-api-access-sjg5p\") pod \"dnsmasq-dns-789c5c5cb7-g78nz\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.095581 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69289a6c-7b95-4c16-a326-dab582fc4b86","Type":"ContainerDied","Data":"d4df07953ad4f0ccacba3de5700a9a959deab55b104a155e3b9482a725864ec0"}
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.095643 4817 scope.go:117] "RemoveContainer" containerID="4349e8f38b3e490984b83b035c211819bd8b59dddba035180f6537e1e67865b1"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.095820 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.111809 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64ff6b6cd6-6qg5b" event={"ID":"8b0830a0-1f90-4b33-8976-875adeb804f9","Type":"ContainerDied","Data":"a1834ce4dec06d3730ebe5f5f3ce43bb088ac6f06075ee7f3f96e57babfdfc43"}
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.112068 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64ff6b6cd6-6qg5b"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.128585 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.128681 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-scripts\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.128723 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.128850 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90dec55d-864d-49da-b960-0a8e51e4d0ad-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.128895 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86b5k\" (UniqueName: \"kubernetes.io/projected/90dec55d-864d-49da-b960-0a8e51e4d0ad-kube-api-access-86b5k\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.129071 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.137170 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95d56546f-smk2l"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.141189 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.151087 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95d56546f-smk2l"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.198586 4817 scope.go:117] "RemoveContainer" containerID="dde7cabe52ee3128f06443a1007f42fee418d5247be7aa5cbc03fef4f7e1dcc5"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.212772 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f8cb89c64-cqrwn"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.216633 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.228485 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.228871 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.229115 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v5lxv"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.229638 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.230910 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.231017 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-scripts\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.231056 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.231130 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90dec55d-864d-49da-b960-0a8e51e4d0ad-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.231191 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86b5k\" (UniqueName: \"kubernetes.io/projected/90dec55d-864d-49da-b960-0a8e51e4d0ad-kube-api-access-86b5k\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.231277 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.237656 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90dec55d-864d-49da-b960-0a8e51e4d0ad-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.252060 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-scripts\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.254690 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f8cb89c64-cqrwn"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.258030 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.271287 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.282492 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.282569 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.696993332 podStartE2EDuration="16.282554905s" podCreationTimestamp="2026-02-18 14:19:13 +0000 UTC" firstStartedPulling="2026-02-18 14:19:14.754224393 +0000 UTC m=+1217.329760376" lastFinishedPulling="2026-02-18 14:19:28.339785966 +0000 UTC m=+1230.915321949" observedRunningTime="2026-02-18 14:19:29.103277438 +0000 UTC m=+1231.678813421" watchObservedRunningTime="2026-02-18 14:19:29.282554905 +0000 UTC m=+1231.858090898"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.283755 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86b5k\" (UniqueName: \"kubernetes.io/projected/90dec55d-864d-49da-b960-0a8e51e4d0ad-kube-api-access-86b5k\") pod \"cinder-scheduler-0\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.324032 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338015 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-swift-storage-0\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338079 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-config\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338127 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-combined-ca-bundle\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338199 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjxkv\" (UniqueName: \"kubernetes.io/projected/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-kube-api-access-fjxkv\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338259 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-httpd-config\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338285 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-nb\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338359 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blbkx\" (UniqueName: \"kubernetes.io/projected/c85f7224-7818-41ea-a46b-3d55ac66cece-kube-api-access-blbkx\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338386 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-config\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338431 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-ovndb-tls-certs\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338477 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-sb\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.338509 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-svc\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.341523 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.363557 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.373913 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.382885 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.387321 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.394891 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.430924 4817 scope.go:117] "RemoveContainer" containerID="c3a599395d06f206c5faa5386bea5b3abc4cb33817b5a9e4c092c926794b4c0f"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440189 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blbkx\" (UniqueName: \"kubernetes.io/projected/c85f7224-7818-41ea-a46b-3d55ac66cece-kube-api-access-blbkx\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440239 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-config\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440290 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-ovndb-tls-certs\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440318 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-sb\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440350 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-svc\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440419 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-swift-storage-0\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440452 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-config\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440511 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-combined-ca-bundle\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440572 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjxkv\" (UniqueName: \"kubernetes.io/projected/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-kube-api-access-fjxkv\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440619 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-httpd-config\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.440647 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-nb\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.445706 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-combined-ca-bundle\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.451486 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-nb\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.456679 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-httpd-config\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.465364 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.470595 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-config\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.471194 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-config\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.472034 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.476362 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.477713 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-sb\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.477713 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-svc\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.480617 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-ovndb-tls-certs\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.485552 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjxkv\" (UniqueName: \"kubernetes.io/projected/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-kube-api-access-fjxkv\") pod \"neutron-f8cb89c64-cqrwn\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.490992 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-swift-storage-0\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.508277 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blbkx\" (UniqueName: \"kubernetes.io/projected/c85f7224-7818-41ea-a46b-3d55ac66cece-kube-api-access-blbkx\") pod \"dnsmasq-dns-95d56546f-smk2l\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") " pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.524033 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.537951 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-64ff6b6cd6-6qg5b"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.541910 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-run-httpd\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.541948 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-scripts\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.542041 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.542075 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-config-data\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.542139 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2spmb\" (UniqueName: \"kubernetes.io/projected/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-kube-api-access-2spmb\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.542183 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.542268 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-log-httpd\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.553409 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-64ff6b6cd6-6qg5b"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.570749 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.643598 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-scripts\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.643713 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-log-httpd\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.643746 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-run-httpd\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.643766 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-scripts\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.643794 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.643855 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.643893 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-config-data\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.643918 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh2wv\" (UniqueName: \"kubernetes.io/projected/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-kube-api-access-xh2wv\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.643968 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-logs\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.644009 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.644032 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2spmb\" (UniqueName: \"kubernetes.io/projected/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-kube-api-access-2spmb\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.644067 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data-custom\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.644112 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.644138 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.644699 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-log-httpd\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.644724 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-run-httpd\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.650456 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.654712 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.657729 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-config-data\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.665587 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-scripts\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.669246 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2spmb\" (UniqueName: \"kubernetes.io/projected/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-kube-api-access-2spmb\") pod \"ceilometer-0\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") " pod="openstack/ceilometer-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.671552 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.689393 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-549ff9d7ff-4pfxq"]
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.733051 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f8cb89c64-cqrwn"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.736158 4817 scope.go:117] "RemoveContainer" containerID="446fd5b3ecb61957312dd974bf64fcfa6943b99e18c4a83378af0878c142cda6"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.748130 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.748540 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-scripts\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.748720 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0"
Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.748910 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh2wv\" (UniqueName: \"kubernetes.io/projected/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-kube-api-access-xh2wv\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " 
pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.749014 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-logs\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.749040 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.750348 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data-custom\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.754570 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.755228 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-logs\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.765755 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.771496 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-scripts\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.772616 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.774825 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh2wv\" (UniqueName: \"kubernetes.io/projected/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-kube-api-access-xh2wv\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.775055 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2fk5c" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.778489 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data-custom\") pod \"cinder-api-0\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.847286 4817 scope.go:117] "RemoveContainer" containerID="ca9e72e424c10baf7d397ed920fbc591f0259c9858ae4eb24d73920e41c1129f" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.851298 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-db-sync-config-data\") pod \"8de51007-ada2-49f5-90b2-11151899e3cf\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.851895 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7fzc\" (UniqueName: \"kubernetes.io/projected/8de51007-ada2-49f5-90b2-11151899e3cf-kube-api-access-q7fzc\") pod \"8de51007-ada2-49f5-90b2-11151899e3cf\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.852069 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-config-data\") pod \"8de51007-ada2-49f5-90b2-11151899e3cf\" (UID: \"8de51007-ada2-49f5-90b2-11151899e3cf\") " Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.852108 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-combined-ca-bundle\") pod \"8de51007-ada2-49f5-90b2-11151899e3cf\" (UID: 
\"8de51007-ada2-49f5-90b2-11151899e3cf\") " Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.855374 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8de51007-ada2-49f5-90b2-11151899e3cf" (UID: "8de51007-ada2-49f5-90b2-11151899e3cf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.863705 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de51007-ada2-49f5-90b2-11151899e3cf-kube-api-access-q7fzc" (OuterVolumeSpecName: "kube-api-access-q7fzc") pod "8de51007-ada2-49f5-90b2-11151899e3cf" (UID: "8de51007-ada2-49f5-90b2-11151899e3cf"). InnerVolumeSpecName "kube-api-access-q7fzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.906082 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.955068 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-jgz44"] Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.959487 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.960927 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8de51007-ada2-49f5-90b2-11151899e3cf" (UID: "8de51007-ada2-49f5-90b2-11151899e3cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.962259 4817 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.962288 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7fzc\" (UniqueName: \"kubernetes.io/projected/8de51007-ada2-49f5-90b2-11151899e3cf-kube-api-access-q7fzc\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.962299 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:29 crc kubenswrapper[4817]: I0218 14:19:29.996211 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-config-data" (OuterVolumeSpecName: "config-data") pod "8de51007-ada2-49f5-90b2-11151899e3cf" (UID: "8de51007-ada2-49f5-90b2-11151899e3cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.063897 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8de51007-ada2-49f5-90b2-11151899e3cf-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.158649 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fk5c" event={"ID":"8de51007-ada2-49f5-90b2-11151899e3cf","Type":"ContainerDied","Data":"bb641bbbdcbc85305cb1c82ecfc41c8af46270b50aa7e87aba5ae233f0c82a8b"} Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.159042 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb641bbbdcbc85305cb1c82ecfc41c8af46270b50aa7e87aba5ae233f0c82a8b" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.159010 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2fk5c" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.206769 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69289a6c-7b95-4c16-a326-dab582fc4b86" path="/var/lib/kubelet/pods/69289a6c-7b95-4c16-a326-dab582fc4b86/volumes" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.207821 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b0830a0-1f90-4b33-8976-875adeb804f9" path="/var/lib/kubelet/pods/8b0830a0-1f90-4b33-8976-875adeb804f9/volumes" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.208372 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.224296 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-549ff9d7ff-4pfxq" event={"ID":"85a61008-fd45-4598-90bc-b0cf2856cefa","Type":"ContainerStarted","Data":"356b9a5203970b43edc57f418298ec1f932efe6e782e2bbf4545840a6ef149b3"} Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.326716 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.377094 4817 scope.go:117] "RemoveContainer" containerID="78c0dd6a38727f9c2bedc253d7c6cf61f7a740807fbceb83f866aef29681a977" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.504830 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95d56546f-smk2l"] Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.613252 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f8cb89c64-cqrwn"] Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.617292 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.710173 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-svc\") pod \"e4bc0418-8267-476b-9f19-60f05965bfec\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.710522 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-nb\") pod \"e4bc0418-8267-476b-9f19-60f05965bfec\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.710579 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-sb\") pod \"e4bc0418-8267-476b-9f19-60f05965bfec\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.710606 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjg5p\" (UniqueName: \"kubernetes.io/projected/e4bc0418-8267-476b-9f19-60f05965bfec-kube-api-access-sjg5p\") pod \"e4bc0418-8267-476b-9f19-60f05965bfec\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.710716 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-config\") pod \"e4bc0418-8267-476b-9f19-60f05965bfec\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.710760 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-swift-storage-0\") pod \"e4bc0418-8267-476b-9f19-60f05965bfec\" (UID: \"e4bc0418-8267-476b-9f19-60f05965bfec\") " Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.710744 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e4bc0418-8267-476b-9f19-60f05965bfec" (UID: "e4bc0418-8267-476b-9f19-60f05965bfec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.711213 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e4bc0418-8267-476b-9f19-60f05965bfec" (UID: "e4bc0418-8267-476b-9f19-60f05965bfec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.711487 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.711504 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.711576 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e4bc0418-8267-476b-9f19-60f05965bfec" (UID: "e4bc0418-8267-476b-9f19-60f05965bfec"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.711958 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-config" (OuterVolumeSpecName: "config") pod "e4bc0418-8267-476b-9f19-60f05965bfec" (UID: "e4bc0418-8267-476b-9f19-60f05965bfec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.712366 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e4bc0418-8267-476b-9f19-60f05965bfec" (UID: "e4bc0418-8267-476b-9f19-60f05965bfec"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.723426 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4bc0418-8267-476b-9f19-60f05965bfec-kube-api-access-sjg5p" (OuterVolumeSpecName: "kube-api-access-sjg5p") pod "e4bc0418-8267-476b-9f19-60f05965bfec" (UID: "e4bc0418-8267-476b-9f19-60f05965bfec"). InnerVolumeSpecName "kube-api-access-sjg5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.816499 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.816558 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjg5p\" (UniqueName: \"kubernetes.io/projected/e4bc0418-8267-476b-9f19-60f05965bfec-kube-api-access-sjg5p\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.816573 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:30 crc kubenswrapper[4817]: I0218 14:19:30.816586 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4bc0418-8267-476b-9f19-60f05965bfec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.239333 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.264231 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-jgz44" event={"ID":"c6f7d4df-bc28-4a01-a044-091894ac27c2","Type":"ContainerStarted","Data":"f04251b56135cf24124d6f6b653e2c82951eecc3a89e526f44eabe6a9a119e2f"} Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.264661 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-jgz44" event={"ID":"c6f7d4df-bc28-4a01-a044-091894ac27c2","Type":"ContainerStarted","Data":"9ec7de488cfce9abf572a11225c170ce0aaf396fa32791239bcc1cff60fbce32"} Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.277793 4817 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90dec55d-864d-49da-b960-0a8e51e4d0ad","Type":"ContainerStarted","Data":"4f3ad76fad3426460ffadd9343cac72bcd7b354bcc695f9a091e0bcd3cf85fdd"} Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.294761 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-jgz44" podStartSLOduration=3.294747245 podStartE2EDuration="3.294747245s" podCreationTimestamp="2026-02-18 14:19:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:31.29334946 +0000 UTC m=+1233.868885443" watchObservedRunningTime="2026-02-18 14:19:31.294747245 +0000 UTC m=+1233.870283228" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.311542 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-smk2l" event={"ID":"c85f7224-7818-41ea-a46b-3d55ac66cece","Type":"ContainerStarted","Data":"d9e448eeffe6875db790257741d76906384b352c0bfb87810a0851bc44f249fa"} Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.311588 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-smk2l" event={"ID":"c85f7224-7818-41ea-a46b-3d55ac66cece","Type":"ContainerStarted","Data":"ec3fd88083ad8710ce3c97055baf1639095ca1d0a921d5b1f756914d0f6bd9dd"} Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.315708 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8cb89c64-cqrwn" event={"ID":"20cb47f0-a64d-4e7d-93f0-1fed117df7ce","Type":"ContainerStarted","Data":"0a31c6c6302c97db31d81c9b02b8cb8ea2ed7b56fec596129d4c5fc3be5ef214"} Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.315761 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8cb89c64-cqrwn" 
event={"ID":"20cb47f0-a64d-4e7d-93f0-1fed117df7ce","Type":"ContainerStarted","Data":"1789427e05346805b6e3f5ca8b03aac7fe470c01de475e40c3bf13e1e87753f5"} Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.318496 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-549ff9d7ff-4pfxq" event={"ID":"85a61008-fd45-4598-90bc-b0cf2856cefa","Type":"ContainerStarted","Data":"553c180b6f654c5621ee83b2e869d351c92f0d2e6fd6769396156962802090a8"} Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.344231 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-789c5c5cb7-g78nz" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.348644 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95d56546f-smk2l"] Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.388356 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.543736 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-9w568"] Feb 18 14:19:31 crc kubenswrapper[4817]: E0218 14:19:31.544225 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de51007-ada2-49f5-90b2-11151899e3cf" containerName="glance-db-sync" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.544247 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de51007-ada2-49f5-90b2-11151899e3cf" containerName="glance-db-sync" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.544473 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de51007-ada2-49f5-90b2-11151899e3cf" containerName="glance-db-sync" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.549938 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.554435 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.554489 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-config\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.554540 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.554564 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-svc\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.554594 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sr6k\" (UniqueName: \"kubernetes.io/projected/ed573b21-30a7-47b3-bdc8-7d8843074607-kube-api-access-9sr6k\") pod 
\"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.554617 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.659517 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.659920 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-config\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.660030 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.660069 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-svc\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: 
\"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.660128 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sr6k\" (UniqueName: \"kubernetes.io/projected/ed573b21-30a7-47b3-bdc8-7d8843074607-kube-api-access-9sr6k\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.660171 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.664366 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.668905 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.673344 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-svc\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " 
pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.674790 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-config\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.688429 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sr6k\" (UniqueName: \"kubernetes.io/projected/ed573b21-30a7-47b3-bdc8-7d8843074607-kube-api-access-9sr6k\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.688468 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-9w568\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") " pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.715192 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.715719 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-9w568"] Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.754171 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-g78nz"] Feb 18 14:19:31 crc kubenswrapper[4817]: I0218 14:19:31.791011 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-g78nz"] Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.360449 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4bc0418-8267-476b-9f19-60f05965bfec" path="/var/lib/kubelet/pods/e4bc0418-8267-476b-9f19-60f05965bfec/volumes" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.361073 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.361098 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.370652 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.370759 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.386516 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.386958 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-24d5c" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.387656 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.448866 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.448919 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-config-data\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.449009 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-scripts\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.449064 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.449440 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-logs\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.449515 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b4g4\" (UniqueName: \"kubernetes.io/projected/65d143e8-b16a-4487-8330-d5af2dbf6e2a-kube-api-access-9b4g4\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.449573 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.594470 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.598315 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: 
\"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.598357 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-config-data\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.598403 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-scripts\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.598437 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.598479 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-logs\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.598506 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b4g4\" (UniqueName: \"kubernetes.io/projected/65d143e8-b16a-4487-8330-d5af2dbf6e2a-kube-api-access-9b4g4\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " 
pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.598532 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.599096 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.603250 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-logs\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.611648 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.627432 4817 generic.go:334] "Generic (PLEG): container finished" podID="c85f7224-7818-41ea-a46b-3d55ac66cece" containerID="d9e448eeffe6875db790257741d76906384b352c0bfb87810a0851bc44f249fa" exitCode=0 Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.627539 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-smk2l" event={"ID":"c85f7224-7818-41ea-a46b-3d55ac66cece","Type":"ContainerDied","Data":"d9e448eeffe6875db790257741d76906384b352c0bfb87810a0851bc44f249fa"} Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.627571 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-smk2l" event={"ID":"c85f7224-7818-41ea-a46b-3d55ac66cece","Type":"ContainerStarted","Data":"2d049fe5389d43826253222b153a9b4695d232e1861bfda298f5838892f8f621"} Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.627789 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95d56546f-smk2l" podUID="c85f7224-7818-41ea-a46b-3d55ac66cece" containerName="dnsmasq-dns" containerID="cri-o://2d049fe5389d43826253222b153a9b4695d232e1861bfda298f5838892f8f621" gracePeriod=10 Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.627948 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95d56546f-smk2l" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.628494 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.633163 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.633597 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-scripts\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.641585 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.656193 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-config-data\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.694643 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.694682 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/33f17c260ac02f21898fdc222178f71cf5d08760d15202906af70758a1a97161/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.697120 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b4g4\" (UniqueName: \"kubernetes.io/projected/65d143e8-b16a-4487-8330-d5af2dbf6e2a-kube-api-access-9b4g4\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.732639 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95d56546f-smk2l" podStartSLOduration=4.73261597 podStartE2EDuration="4.73261597s" podCreationTimestamp="2026-02-18 14:19:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:32.705722884 +0000 UTC m=+1235.281258877" watchObservedRunningTime="2026-02-18 14:19:32.73261597 +0000 UTC m=+1235.308151953" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.746004 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8cb89c64-cqrwn" event={"ID":"20cb47f0-a64d-4e7d-93f0-1fed117df7ce","Type":"ContainerStarted","Data":"4c7116c6393fc406c9f817e4c5a1d598219578470179be0f40902636c9680953"} Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.746323 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/neutron-f8cb89c64-cqrwn" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.768210 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-549ff9d7ff-4pfxq" event={"ID":"85a61008-fd45-4598-90bc-b0cf2856cefa","Type":"ContainerStarted","Data":"166feedc5e44cce1c789edddc72773c8bd8b0119383f964430e261cdd96cd0b3"} Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.768363 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.768383 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.771199 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a","Type":"ContainerStarted","Data":"bb4547964222a00b27ee283cc159d46ca241897827d6e086ca055d9f5273b15f"} Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.784871 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f8cb89c64-cqrwn" podStartSLOduration=3.784853243 podStartE2EDuration="3.784853243s" podCreationTimestamp="2026-02-18 14:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:32.776159614 +0000 UTC m=+1235.351695597" watchObservedRunningTime="2026-02-18 14:19:32.784853243 +0000 UTC m=+1235.360389226" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.796656 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7","Type":"ContainerStarted","Data":"d84806ac705d4ba0ad5837998525e506fd7f8f684d5f5bbafa345154eafd4ad3"} Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.806563 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-logs\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.806648 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.806727 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.806799 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.806872 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc 
kubenswrapper[4817]: I0218 14:19:32.806899 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl9jj\" (UniqueName: \"kubernetes.io/projected/fb00d9fd-87a9-4180-a447-3a689a07d067-kube-api-access-hl9jj\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.806931 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.822809 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-549ff9d7ff-4pfxq" podStartSLOduration=11.822790976 podStartE2EDuration="11.822790976s" podCreationTimestamp="2026-02-18 14:19:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:32.818457298 +0000 UTC m=+1235.393993281" watchObservedRunningTime="2026-02-18 14:19:32.822790976 +0000 UTC m=+1235.398326949" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.875954 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-9w568"] Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.910715 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.910894 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.911055 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.911150 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.911175 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl9jj\" (UniqueName: \"kubernetes.io/projected/fb00d9fd-87a9-4180-a447-3a689a07d067-kube-api-access-hl9jj\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.911231 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.911257 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-logs\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.913311 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.918187 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.919701 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.926403 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-logs\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.935966 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.938919 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl9jj\" (UniqueName: \"kubernetes.io/projected/fb00d9fd-87a9-4180-a447-3a689a07d067-kube-api-access-hl9jj\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.949968 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.951818 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 18 14:19:32 crc kubenswrapper[4817]: I0218 14:19:32.951891 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/81a23472548d53035963276e43796643f625826f28e59ad0adc4b50496ee8da7/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:33 crc kubenswrapper[4817]: I0218 14:19:33.045285 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:33 crc kubenswrapper[4817]: I0218 14:19:33.085962 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:33 crc kubenswrapper[4817]: I0218 14:19:33.086538 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 14:19:33 crc kubenswrapper[4817]: I0218 14:19:33.812693 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a","Type":"ContainerStarted","Data":"3c57c14c6a1470b98b5ec85da130fbeefabf9505c6c010408d4e953e7f4a7ca1"} Feb 18 14:19:33 crc kubenswrapper[4817]: I0218 14:19:33.815560 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7","Type":"ContainerStarted","Data":"214d080275967660b3f537f3d2cf11a7cb70e800ae3470c2e8612f12dc95813f"} Feb 18 14:19:33 crc kubenswrapper[4817]: I0218 14:19:33.822733 4817 generic.go:334] "Generic (PLEG): container finished" podID="ed573b21-30a7-47b3-bdc8-7d8843074607" containerID="152a3754c1fc459ed0c923bc20716b4ca6647241bf64f323939d93a00532731d" exitCode=0 Feb 18 14:19:33 crc kubenswrapper[4817]: I0218 14:19:33.822799 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-9w568" event={"ID":"ed573b21-30a7-47b3-bdc8-7d8843074607","Type":"ContainerDied","Data":"152a3754c1fc459ed0c923bc20716b4ca6647241bf64f323939d93a00532731d"} Feb 18 14:19:33 crc kubenswrapper[4817]: I0218 14:19:33.822854 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-9w568" event={"ID":"ed573b21-30a7-47b3-bdc8-7d8843074607","Type":"ContainerStarted","Data":"0761755a4cd73f79ae2a2da424a8cddbf1d3fd7dc245bddcc8836a7a58abcdd4"} Feb 18 14:19:34 crc kubenswrapper[4817]: I0218 14:19:34.117441 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:34 crc kubenswrapper[4817]: I0218 14:19:34.269464 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:34 crc kubenswrapper[4817]: I0218 14:19:34.885041 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"65d143e8-b16a-4487-8330-d5af2dbf6e2a","Type":"ContainerStarted","Data":"1d2035899bb0e7b63a465290dd9163ea976130a32a901386b88a9ecbfa49da71"} Feb 18 14:19:34 crc kubenswrapper[4817]: I0218 14:19:34.893272 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90dec55d-864d-49da-b960-0a8e51e4d0ad","Type":"ContainerStarted","Data":"21a162474f340754b12b22f618a7353fa5b951b83e285a68a10c6cb791a6a248"} Feb 18 14:19:34 crc kubenswrapper[4817]: I0218 14:19:34.895903 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb00d9fd-87a9-4180-a447-3a689a07d067","Type":"ContainerStarted","Data":"720fa685f6bb4429e26616a2b2afd95eaaeb36a47112bc221a6b8aff8d9c768a"} Feb 18 14:19:34 crc kubenswrapper[4817]: I0218 14:19:34.913043 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-9w568" event={"ID":"ed573b21-30a7-47b3-bdc8-7d8843074607","Type":"ContainerStarted","Data":"693c3b70dfb28f9d779c518cc0749685cca27d25bf824daa09bc85e7ac834e16"} Feb 18 14:19:34 crc kubenswrapper[4817]: I0218 14:19:34.913473 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-9w568" Feb 18 14:19:34 crc kubenswrapper[4817]: I0218 14:19:34.970214 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-9w568" podStartSLOduration=3.970195096 podStartE2EDuration="3.970195096s" podCreationTimestamp="2026-02-18 14:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:34.966921544 +0000 UTC m=+1237.542457527" watchObservedRunningTime="2026-02-18 14:19:34.970195096 +0000 UTC m=+1237.545731079" Feb 18 14:19:35 crc kubenswrapper[4817]: I0218 14:19:35.952133 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"fb00d9fd-87a9-4180-a447-3a689a07d067","Type":"ContainerStarted","Data":"102e8b4d69409586336698ca07f72650a94e8c19610c2b8bea65e622adf39861"} Feb 18 14:19:35 crc kubenswrapper[4817]: I0218 14:19:35.976909 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a","Type":"ContainerStarted","Data":"9b10b55573274a7ecaded4c0b33a6e94a7e8aa54fd8f5827328e8fb7c089eac6"} Feb 18 14:19:35 crc kubenswrapper[4817]: I0218 14:19:35.994361 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65d143e8-b16a-4487-8330-d5af2dbf6e2a","Type":"ContainerStarted","Data":"8469958379c907187bd469dd296e67916e173294bbf53e6624a1467134684b8e"} Feb 18 14:19:35 crc kubenswrapper[4817]: I0218 14:19:35.996303 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90dec55d-864d-49da-b960-0a8e51e4d0ad","Type":"ContainerStarted","Data":"ec8203377c9136561a642d02e614be61b931ce2db90459c2b1b5eee7adbb51ec"} Feb 18 14:19:36 crc kubenswrapper[4817]: I0218 14:19:36.010750 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7","Type":"ContainerStarted","Data":"9bacaf3c5b801bc9502710084ebee2b6855d90936d07210ebb0413ee7796bbaf"} Feb 18 14:19:36 crc kubenswrapper[4817]: I0218 14:19:36.019854 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" containerName="cinder-api-log" containerID="cri-o://214d080275967660b3f537f3d2cf11a7cb70e800ae3470c2e8612f12dc95813f" gracePeriod=30 Feb 18 14:19:36 crc kubenswrapper[4817]: I0218 14:19:36.020010 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 18 14:19:36 crc kubenswrapper[4817]: I0218 14:19:36.020046 4817 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" containerName="cinder-api" containerID="cri-o://9bacaf3c5b801bc9502710084ebee2b6855d90936d07210ebb0413ee7796bbaf" gracePeriod=30 Feb 18 14:19:36 crc kubenswrapper[4817]: I0218 14:19:36.035059 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.546374661 podStartE2EDuration="8.035042633s" podCreationTimestamp="2026-02-18 14:19:28 +0000 UTC" firstStartedPulling="2026-02-18 14:19:30.377230831 +0000 UTC m=+1232.952766814" lastFinishedPulling="2026-02-18 14:19:31.865898803 +0000 UTC m=+1234.441434786" observedRunningTime="2026-02-18 14:19:36.033766401 +0000 UTC m=+1238.609302384" watchObservedRunningTime="2026-02-18 14:19:36.035042633 +0000 UTC m=+1238.610578616" Feb 18 14:19:36 crc kubenswrapper[4817]: I0218 14:19:36.108309 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.108284665 podStartE2EDuration="7.108284665s" podCreationTimestamp="2026-02-18 14:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:36.061091838 +0000 UTC m=+1238.636627821" watchObservedRunningTime="2026-02-18 14:19:36.108284665 +0000 UTC m=+1238.683820648" Feb 18 14:19:36 crc kubenswrapper[4817]: I0218 14:19:36.745189 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:36 crc kubenswrapper[4817]: I0218 14:19:36.806355 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:36 crc kubenswrapper[4817]: I0218 14:19:36.808543 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-549ff9d7ff-4pfxq" Feb 18 14:19:36 crc 
kubenswrapper[4817]: I0218 14:19:36.894483 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.033913 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65d143e8-b16a-4487-8330-d5af2dbf6e2a","Type":"ContainerStarted","Data":"587d5c6fd3cccb4e6f7d661b227173a2069885175c27ccc174b47ea01a95fa2e"} Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.035420 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" containerName="glance-log" containerID="cri-o://8469958379c907187bd469dd296e67916e173294bbf53e6624a1467134684b8e" gracePeriod=30 Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.038280 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" containerName="glance-httpd" containerID="cri-o://587d5c6fd3cccb4e6f7d661b227173a2069885175c27ccc174b47ea01a95fa2e" gracePeriod=30 Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.054892 4817 generic.go:334] "Generic (PLEG): container finished" podID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" containerID="9bacaf3c5b801bc9502710084ebee2b6855d90936d07210ebb0413ee7796bbaf" exitCode=0 Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.054926 4817 generic.go:334] "Generic (PLEG): container finished" podID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" containerID="214d080275967660b3f537f3d2cf11a7cb70e800ae3470c2e8612f12dc95813f" exitCode=143 Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.054971 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7","Type":"ContainerDied","Data":"9bacaf3c5b801bc9502710084ebee2b6855d90936d07210ebb0413ee7796bbaf"} Feb 18 
14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.055014 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7","Type":"ContainerDied","Data":"214d080275967660b3f537f3d2cf11a7cb70e800ae3470c2e8612f12dc95813f"} Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.059170 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fb00d9fd-87a9-4180-a447-3a689a07d067" containerName="glance-log" containerID="cri-o://102e8b4d69409586336698ca07f72650a94e8c19610c2b8bea65e622adf39861" gracePeriod=30 Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.059442 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb00d9fd-87a9-4180-a447-3a689a07d067","Type":"ContainerStarted","Data":"b3095c153f58e3e658bedb00f3f4bb82b4486abb784059e8e499fd883398d49c"} Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.060231 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fb00d9fd-87a9-4180-a447-3a689a07d067" containerName="glance-httpd" containerID="cri-o://b3095c153f58e3e658bedb00f3f4bb82b4486abb784059e8e499fd883398d49c" gracePeriod=30 Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.068828 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.06881166 podStartE2EDuration="6.06881166s" podCreationTimestamp="2026-02-18 14:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:37.065616349 +0000 UTC m=+1239.641152332" watchObservedRunningTime="2026-02-18 14:19:37.06881166 +0000 UTC m=+1239.644347643" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.115072 4817 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.115045642 podStartE2EDuration="6.115045642s" podCreationTimestamp="2026-02-18 14:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:37.105558173 +0000 UTC m=+1239.681094156" watchObservedRunningTime="2026-02-18 14:19:37.115045642 +0000 UTC m=+1239.690581625" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.657829 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.693591 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-scripts\") pod \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.693662 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data-custom\") pod \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.693707 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data\") pod \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.693787 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-etc-machine-id\") pod \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\" (UID: 
\"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.693859 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh2wv\" (UniqueName: \"kubernetes.io/projected/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-kube-api-access-xh2wv\") pod \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.693923 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-logs\") pod \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.693943 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-combined-ca-bundle\") pod \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\" (UID: \"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7\") " Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.695282 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" (UID: "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.696275 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-logs" (OuterVolumeSpecName: "logs") pod "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" (UID: "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.708206 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-kube-api-access-xh2wv" (OuterVolumeSpecName: "kube-api-access-xh2wv") pod "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" (UID: "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7"). InnerVolumeSpecName "kube-api-access-xh2wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.714653 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" (UID: "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.721176 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-scripts" (OuterVolumeSpecName: "scripts") pod "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" (UID: "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.796274 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.796303 4817 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.796313 4817 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.796321 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh2wv\" (UniqueName: \"kubernetes.io/projected/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-kube-api-access-xh2wv\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.796351 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.817229 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" (UID: "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.900391 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:37 crc kubenswrapper[4817]: I0218 14:19:37.907112 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data" (OuterVolumeSpecName: "config-data") pod "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" (UID: "83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.002717 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.067894 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.067895 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7","Type":"ContainerDied","Data":"d84806ac705d4ba0ad5837998525e506fd7f8f684d5f5bbafa345154eafd4ad3"} Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.068031 4817 scope.go:117] "RemoveContainer" containerID="9bacaf3c5b801bc9502710084ebee2b6855d90936d07210ebb0413ee7796bbaf" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.069945 4817 generic.go:334] "Generic (PLEG): container finished" podID="fb00d9fd-87a9-4180-a447-3a689a07d067" containerID="b3095c153f58e3e658bedb00f3f4bb82b4486abb784059e8e499fd883398d49c" exitCode=143 Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.070064 4817 generic.go:334] "Generic (PLEG): container finished" podID="fb00d9fd-87a9-4180-a447-3a689a07d067" containerID="102e8b4d69409586336698ca07f72650a94e8c19610c2b8bea65e622adf39861" exitCode=143 Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.069997 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb00d9fd-87a9-4180-a447-3a689a07d067","Type":"ContainerDied","Data":"b3095c153f58e3e658bedb00f3f4bb82b4486abb784059e8e499fd883398d49c"} Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.070269 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb00d9fd-87a9-4180-a447-3a689a07d067","Type":"ContainerDied","Data":"102e8b4d69409586336698ca07f72650a94e8c19610c2b8bea65e622adf39861"} Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.072182 4817 generic.go:334] "Generic (PLEG): container finished" podID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" containerID="587d5c6fd3cccb4e6f7d661b227173a2069885175c27ccc174b47ea01a95fa2e" exitCode=143 Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.072278 4817 
generic.go:334] "Generic (PLEG): container finished" podID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" containerID="8469958379c907187bd469dd296e67916e173294bbf53e6624a1467134684b8e" exitCode=143 Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.072237 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65d143e8-b16a-4487-8330-d5af2dbf6e2a","Type":"ContainerDied","Data":"587d5c6fd3cccb4e6f7d661b227173a2069885175c27ccc174b47ea01a95fa2e"} Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.072391 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65d143e8-b16a-4487-8330-d5af2dbf6e2a","Type":"ContainerDied","Data":"8469958379c907187bd469dd296e67916e173294bbf53e6624a1467134684b8e"} Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.107392 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.119118 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.143044 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 18 14:19:38 crc kubenswrapper[4817]: E0218 14:19:38.143537 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" containerName="cinder-api-log" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.143562 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" containerName="cinder-api-log" Feb 18 14:19:38 crc kubenswrapper[4817]: E0218 14:19:38.143587 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" containerName="cinder-api" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.143595 4817 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" containerName="cinder-api" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.143824 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" containerName="cinder-api" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.143847 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" containerName="cinder-api-log" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.144899 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.147253 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.147434 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.147574 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.203228 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7" path="/var/lib/kubelet/pods/83ca1ccc-c8db-46fe-b7ad-03ecc07fb8b7/volumes" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.214004 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.311311 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-scripts\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.311381 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a414e293-71b9-44c3-8f07-20f3696f7db6-logs\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.311417 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.311450 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.311502 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.311556 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a414e293-71b9-44c3-8f07-20f3696f7db6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.311594 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9lxh\" (UniqueName: 
\"kubernetes.io/projected/a414e293-71b9-44c3-8f07-20f3696f7db6-kube-api-access-j9lxh\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.311630 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-config-data-custom\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.311929 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-config-data\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.327620 4817 scope.go:117] "RemoveContainer" containerID="214d080275967660b3f537f3d2cf11a7cb70e800ae3470c2e8612f12dc95813f" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.416217 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-scripts\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.416261 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a414e293-71b9-44c3-8f07-20f3696f7db6-logs\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.416292 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.416317 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.416355 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.416393 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a414e293-71b9-44c3-8f07-20f3696f7db6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.416416 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9lxh\" (UniqueName: \"kubernetes.io/projected/a414e293-71b9-44c3-8f07-20f3696f7db6-kube-api-access-j9lxh\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.416440 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-config-data-custom\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 
18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.416487 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-config-data\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.421015 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-config-data\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.429401 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a414e293-71b9-44c3-8f07-20f3696f7db6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.431411 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a414e293-71b9-44c3-8f07-20f3696f7db6-logs\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.443577 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-config-data-custom\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.443684 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.443860 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.445349 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-scripts\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.445782 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a414e293-71b9-44c3-8f07-20f3696f7db6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.450155 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9lxh\" (UniqueName: \"kubernetes.io/projected/a414e293-71b9-44c3-8f07-20f3696f7db6-kube-api-access-j9lxh\") pod \"cinder-api-0\" (UID: \"a414e293-71b9-44c3-8f07-20f3696f7db6\") " pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.479184 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 18 14:19:38 crc kubenswrapper[4817]: I0218 14:19:38.907018 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.003781 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.037712 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-combined-ca-bundle\") pod \"fb00d9fd-87a9-4180-a447-3a689a07d067\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.037923 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-httpd-run\") pod \"fb00d9fd-87a9-4180-a447-3a689a07d067\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.038045 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-scripts\") pod \"fb00d9fd-87a9-4180-a447-3a689a07d067\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.038080 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-config-data\") pod \"fb00d9fd-87a9-4180-a447-3a689a07d067\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.038137 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl9jj\" (UniqueName: \"kubernetes.io/projected/fb00d9fd-87a9-4180-a447-3a689a07d067-kube-api-access-hl9jj\") pod \"fb00d9fd-87a9-4180-a447-3a689a07d067\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.038242 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-logs\") pod \"fb00d9fd-87a9-4180-a447-3a689a07d067\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.038369 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"fb00d9fd-87a9-4180-a447-3a689a07d067\" (UID: \"fb00d9fd-87a9-4180-a447-3a689a07d067\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.039283 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fb00d9fd-87a9-4180-a447-3a689a07d067" (UID: "fb00d9fd-87a9-4180-a447-3a689a07d067"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.039762 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-logs" (OuterVolumeSpecName: "logs") pod "fb00d9fd-87a9-4180-a447-3a689a07d067" (UID: "fb00d9fd-87a9-4180-a447-3a689a07d067"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.044652 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb00d9fd-87a9-4180-a447-3a689a07d067-kube-api-access-hl9jj" (OuterVolumeSpecName: "kube-api-access-hl9jj") pod "fb00d9fd-87a9-4180-a447-3a689a07d067" (UID: "fb00d9fd-87a9-4180-a447-3a689a07d067"). InnerVolumeSpecName "kube-api-access-hl9jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.044883 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-scripts" (OuterVolumeSpecName: "scripts") pod "fb00d9fd-87a9-4180-a447-3a689a07d067" (UID: "fb00d9fd-87a9-4180-a447-3a689a07d067"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.076149 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919" (OuterVolumeSpecName: "glance") pod "fb00d9fd-87a9-4180-a447-3a689a07d067" (UID: "fb00d9fd-87a9-4180-a447-3a689a07d067"). InnerVolumeSpecName "pvc-be497b0f-a266-4a29-bed4-fc9e942da919". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.096199 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb00d9fd-87a9-4180-a447-3a689a07d067" (UID: "fb00d9fd-87a9-4180-a447-3a689a07d067"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.117943 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"65d143e8-b16a-4487-8330-d5af2dbf6e2a","Type":"ContainerDied","Data":"1d2035899bb0e7b63a465290dd9163ea976130a32a901386b88a9ecbfa49da71"} Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.118009 4817 scope.go:117] "RemoveContainer" containerID="587d5c6fd3cccb4e6f7d661b227173a2069885175c27ccc174b47ea01a95fa2e" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.118067 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.127805 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb00d9fd-87a9-4180-a447-3a689a07d067","Type":"ContainerDied","Data":"720fa685f6bb4429e26616a2b2afd95eaaeb36a47112bc221a6b8aff8d9c768a"} Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.127907 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.141222 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.145519 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-httpd-run\") pod \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.145647 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-config-data\") pod \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.145712 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b4g4\" (UniqueName: \"kubernetes.io/projected/65d143e8-b16a-4487-8330-d5af2dbf6e2a-kube-api-access-9b4g4\") pod \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.145809 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-logs\") pod \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.146034 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " Feb 18 14:19:39 crc 
kubenswrapper[4817]: I0218 14:19:39.146066 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-combined-ca-bundle\") pod \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.146195 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-scripts\") pod \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\" (UID: \"65d143e8-b16a-4487-8330-d5af2dbf6e2a\") " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.148219 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-logs" (OuterVolumeSpecName: "logs") pod "65d143e8-b16a-4487-8330-d5af2dbf6e2a" (UID: "65d143e8-b16a-4487-8330-d5af2dbf6e2a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.149712 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.149735 4817 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.149748 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.149760 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.149771 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl9jj\" (UniqueName: \"kubernetes.io/projected/fb00d9fd-87a9-4180-a447-3a689a07d067-kube-api-access-hl9jj\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.149785 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb00d9fd-87a9-4180-a447-3a689a07d067-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.149816 4817 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") on node \"crc\" " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.150165 
4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "65d143e8-b16a-4487-8330-d5af2dbf6e2a" (UID: "65d143e8-b16a-4487-8330-d5af2dbf6e2a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.168868 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-config-data" (OuterVolumeSpecName: "config-data") pod "fb00d9fd-87a9-4180-a447-3a689a07d067" (UID: "fb00d9fd-87a9-4180-a447-3a689a07d067"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.171924 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65d143e8-b16a-4487-8330-d5af2dbf6e2a-kube-api-access-9b4g4" (OuterVolumeSpecName: "kube-api-access-9b4g4") pod "65d143e8-b16a-4487-8330-d5af2dbf6e2a" (UID: "65d143e8-b16a-4487-8330-d5af2dbf6e2a"). InnerVolumeSpecName "kube-api-access-9b4g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.173272 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-scripts" (OuterVolumeSpecName: "scripts") pod "65d143e8-b16a-4487-8330-d5af2dbf6e2a" (UID: "65d143e8-b16a-4487-8330-d5af2dbf6e2a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.183567 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c8c8d4f9c-f58g5"] Feb 18 14:19:39 crc kubenswrapper[4817]: E0218 14:19:39.185325 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" containerName="glance-log" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.185452 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" containerName="glance-log" Feb 18 14:19:39 crc kubenswrapper[4817]: E0218 14:19:39.185540 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" containerName="glance-httpd" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.185607 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" containerName="glance-httpd" Feb 18 14:19:39 crc kubenswrapper[4817]: E0218 14:19:39.185680 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb00d9fd-87a9-4180-a447-3a689a07d067" containerName="glance-log" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.185728 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb00d9fd-87a9-4180-a447-3a689a07d067" containerName="glance-log" Feb 18 14:19:39 crc kubenswrapper[4817]: E0218 14:19:39.185779 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb00d9fd-87a9-4180-a447-3a689a07d067" containerName="glance-httpd" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.185830 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb00d9fd-87a9-4180-a447-3a689a07d067" containerName="glance-httpd" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.186125 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb00d9fd-87a9-4180-a447-3a689a07d067" containerName="glance-log" Feb 18 14:19:39 crc 
kubenswrapper[4817]: I0218 14:19:39.186217 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb00d9fd-87a9-4180-a447-3a689a07d067" containerName="glance-httpd" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.186279 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" containerName="glance-httpd" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.186339 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" containerName="glance-log" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.189219 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.193455 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c8c8d4f9c-f58g5"] Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.204722 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.204970 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.205106 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c" (OuterVolumeSpecName: "glance") pod "65d143e8-b16a-4487-8330-d5af2dbf6e2a" (UID: "65d143e8-b16a-4487-8330-d5af2dbf6e2a"). InnerVolumeSpecName "pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.233786 4817 scope.go:117] "RemoveContainer" containerID="8469958379c907187bd469dd296e67916e173294bbf53e6624a1467134684b8e" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.251711 4817 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") on node \"crc\" " Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.251959 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb00d9fd-87a9-4180-a447-3a689a07d067-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.252095 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.252184 4817 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65d143e8-b16a-4487-8330-d5af2dbf6e2a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.252245 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b4g4\" (UniqueName: \"kubernetes.io/projected/65d143e8-b16a-4487-8330-d5af2dbf6e2a-kube-api-access-9b4g4\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.252116 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65d143e8-b16a-4487-8330-d5af2dbf6e2a" (UID: "65d143e8-b16a-4487-8330-d5af2dbf6e2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.271871 4817 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.272294 4817 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-be497b0f-a266-4a29-bed4-fc9e942da919" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919") on node "crc" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.274110 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-config-data" (OuterVolumeSpecName: "config-data") pod "65d143e8-b16a-4487-8330-d5af2dbf6e2a" (UID: "65d143e8-b16a-4487-8330-d5af2dbf6e2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.290246 4817 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.290444 4817 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c") on node "crc" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.358369 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-httpd-config\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.358482 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-combined-ca-bundle\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.358605 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-ovndb-tls-certs\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.358744 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-internal-tls-certs\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.358910 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-public-tls-certs\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.359034 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24qz4\" (UniqueName: \"kubernetes.io/projected/206709c6-0550-4932-8f0e-f9d4c342a26c-kube-api-access-24qz4\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.359066 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-config\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.359184 4817 reconciler_common.go:293] "Volume detached for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.359205 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.359219 4817 reconciler_common.go:293] "Volume detached for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") on node \"crc\" DevicePath \"\"" Feb 
18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.359232 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d143e8-b16a-4487-8330-d5af2dbf6e2a-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.451409 4817 scope.go:117] "RemoveContainer" containerID="b3095c153f58e3e658bedb00f3f4bb82b4486abb784059e8e499fd883398d49c" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.460844 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-httpd-config\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.460920 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-combined-ca-bundle\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.460992 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-ovndb-tls-certs\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.461054 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-internal-tls-certs\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc 
kubenswrapper[4817]: I0218 14:19:39.461133 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-public-tls-certs\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.461169 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24qz4\" (UniqueName: \"kubernetes.io/projected/206709c6-0550-4932-8f0e-f9d4c342a26c-kube-api-access-24qz4\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.461192 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-config\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.468364 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.472998 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-ovndb-tls-certs\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.477046 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-public-tls-certs\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " 
pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.477846 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-httpd-config\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.478849 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-combined-ca-bundle\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.480055 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-internal-tls-certs\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.489035 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/206709c6-0550-4932-8f0e-f9d4c342a26c-config\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.493099 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24qz4\" (UniqueName: \"kubernetes.io/projected/206709c6-0550-4932-8f0e-f9d4c342a26c-kube-api-access-24qz4\") pod \"neutron-7c8c8d4f9c-f58g5\" (UID: \"206709c6-0550-4932-8f0e-f9d4c342a26c\") " pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.513069 4817 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.516198 4817 scope.go:117] "RemoveContainer" containerID="102e8b4d69409586336698ca07f72650a94e8c19610c2b8bea65e622adf39861" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.522862 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.532186 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.546882 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.561500 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.568287 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.568499 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.568645 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-24d5c" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.568810 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.570021 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.573220 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 18 14:19:39 crc 
kubenswrapper[4817]: I0218 14:19:39.574879 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.180:8080/\": dial tcp 10.217.0.180:8080: connect: connection refused" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.599452 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.624347 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.626293 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.631479 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.631657 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.636185 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.668373 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-scripts\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.668427 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.668491 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-config-data\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.668538 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79875\" (UniqueName: \"kubernetes.io/projected/459bd237-6715-4821-8613-1b24c597b247-kube-api-access-79875\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.668584 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.668602 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.668664 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.668757 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-logs\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.686095 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.770125 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-config-data\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.770299 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.770435 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79875\" (UniqueName: \"kubernetes.io/projected/459bd237-6715-4821-8613-1b24c597b247-kube-api-access-79875\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.770530 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.770659 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.770759 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.770866 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.771029 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.773441 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.773629 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.773658 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.775880 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.775953 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6sp\" (UniqueName: \"kubernetes.io/projected/eae76135-65ce-405b-8f64-55bb66e7de22-kube-api-access-kb6sp\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.776208 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-logs\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.776238 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-scripts\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.776285 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.776373 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-logs\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.776970 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName:
\"kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-logs\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.778197 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.779143 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.779661 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-config-data\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.784771 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-scripts\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.796610 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.796649 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/33f17c260ac02f21898fdc222178f71cf5d08760d15202906af70758a1a97161/globalmount\"" pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.808410 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79875\" (UniqueName: \"kubernetes.io/projected/459bd237-6715-4821-8613-1b24c597b247-kube-api-access-79875\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.864124 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " pod="openstack/glance-default-external-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.878206 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.878267 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.878284 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6sp\" (UniqueName: \"kubernetes.io/projected/eae76135-65ce-405b-8f64-55bb66e7de22-kube-api-access-kb6sp\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.878352 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-logs\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.878382 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.878421 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.878465 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.878502 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.882794 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.882835 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/81a23472548d53035963276e43796643f625826f28e59ad0adc4b50496ee8da7/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.882883 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-logs\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.883371 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.890580 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.891415 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.896012 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.896089 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.902061 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6sp\" (UniqueName: \"kubernetes.io/projected/eae76135-65ce-405b-8f64-55bb66e7de22-kube-api-access-kb6sp\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\")
" pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:39 crc kubenswrapper[4817]: I0218 14:19:39.962208 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:40 crc kubenswrapper[4817]: I0218 14:19:40.015686 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 18 14:19:40 crc kubenswrapper[4817]: I0218 14:19:40.019108 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:40 crc kubenswrapper[4817]: I0218 14:19:40.191673 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65d143e8-b16a-4487-8330-d5af2dbf6e2a" path="/var/lib/kubelet/pods/65d143e8-b16a-4487-8330-d5af2dbf6e2a/volumes"
Feb 18 14:19:40 crc kubenswrapper[4817]: I0218 14:19:40.192716 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb00d9fd-87a9-4180-a447-3a689a07d067" path="/var/lib/kubelet/pods/fb00d9fd-87a9-4180-a447-3a689a07d067/volumes"
Feb 18 14:19:40 crc kubenswrapper[4817]: I0218 14:19:40.193358 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a","Type":"ContainerStarted","Data":"41f2843f88e5027daafa351b27095b39840388b314da0591e06ecb86d9b2bd49"}
Feb 18 14:19:40 crc kubenswrapper[4817]: I0218 14:19:40.193382 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a414e293-71b9-44c3-8f07-20f3696f7db6","Type":"ContainerStarted","Data":"05f942ad93c2fae9c54b7848d403474b0065d4b610081bd1d49c7818187062be"}
Feb 18 14:19:40 crc kubenswrapper[4817]: W0218 14:19:40.308249 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod206709c6_0550_4932_8f0e_f9d4c342a26c.slice/crio-7cd81565a3b9057139a1580c0c171a0fcc55d916c9b304d5aaa63ff241ff03ee WatchSource:0}: Error finding container 7cd81565a3b9057139a1580c0c171a0fcc55d916c9b304d5aaa63ff241ff03ee: Status 404 returned error can't find the container with id 7cd81565a3b9057139a1580c0c171a0fcc55d916c9b304d5aaa63ff241ff03ee
Feb 18 14:19:40 crc kubenswrapper[4817]: I0218 14:19:40.310549 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c8c8d4f9c-f58g5"]
Feb 18 14:19:40 crc kubenswrapper[4817]: W0218 14:19:40.771481 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod459bd237_6715_4821_8613_1b24c597b247.slice/crio-7d758ee87f43bc2a0c84fe31628662ed96e252698309e06aa5e31e1e35ad3f1d WatchSource:0}: Error finding container 7d758ee87f43bc2a0c84fe31628662ed96e252698309e06aa5e31e1e35ad3f1d: Status 404 returned error can't find the container with id 7d758ee87f43bc2a0c84fe31628662ed96e252698309e06aa5e31e1e35ad3f1d
Feb 18 14:19:40 crc kubenswrapper[4817]: I0218 14:19:40.790199 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 18 14:19:40 crc kubenswrapper[4817]: I0218 14:19:40.955742 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 14:19:40 crc kubenswrapper[4817]: W0218 14:19:40.970806 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeae76135_65ce_405b_8f64_55bb66e7de22.slice/crio-7f16d213ed7ec1f8b7ec07b54781f87fb8619d8f2cb9021138a53fdad020930d WatchSource:0}: Error finding container 7f16d213ed7ec1f8b7ec07b54781f87fb8619d8f2cb9021138a53fdad020930d: Status 404 returned error can't find the container with id 7f16d213ed7ec1f8b7ec07b54781f87fb8619d8f2cb9021138a53fdad020930d
Feb 18 14:19:41 crc kubenswrapper[4817]: I0218 14:19:41.201993 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eae76135-65ce-405b-8f64-55bb66e7de22","Type":"ContainerStarted","Data":"7f16d213ed7ec1f8b7ec07b54781f87fb8619d8f2cb9021138a53fdad020930d"}
Feb 18 14:19:41 crc kubenswrapper[4817]: I0218 14:19:41.204186 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"459bd237-6715-4821-8613-1b24c597b247","Type":"ContainerStarted","Data":"7d758ee87f43bc2a0c84fe31628662ed96e252698309e06aa5e31e1e35ad3f1d"}
Feb 18 14:19:41 crc kubenswrapper[4817]: I0218 14:19:41.205807 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c8c8d4f9c-f58g5" event={"ID":"206709c6-0550-4932-8f0e-f9d4c342a26c","Type":"ContainerStarted","Data":"7cd81565a3b9057139a1580c0c171a0fcc55d916c9b304d5aaa63ff241ff03ee"}
Feb 18 14:19:41 crc kubenswrapper[4817]: I0218 14:19:41.718579 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-9w568"
Feb 18 14:19:41 crc kubenswrapper[4817]: I0218 14:19:41.810948 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-w5k2f"]
Feb 18 14:19:41 crc kubenswrapper[4817]: I0218 14:19:41.811202 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" podUID="f1544da4-e27b-4191-960d-d2a2145c6a42" containerName="dnsmasq-dns" containerID="cri-o://c14342e7e3cade0d49811e211d0118efbb76d10f00b8a8c70653c697a6221a7a" gracePeriod=10
Feb 18 14:19:42 crc kubenswrapper[4817]: I0218 14:19:42.220630 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a414e293-71b9-44c3-8f07-20f3696f7db6","Type":"ContainerStarted","Data":"742fa86c3b9aa7c899c86fc5072541dec81ac83b5d1d0324d66839cec93ec5be"}
Feb 18 14:19:42 crc kubenswrapper[4817]: I0218 14:19:42.863458 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:19:42 crc kubenswrapper[4817]: I0218 14:19:42.863834 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:19:43 crc kubenswrapper[4817]: I0218 14:19:43.257858 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eae76135-65ce-405b-8f64-55bb66e7de22","Type":"ContainerStarted","Data":"00b7ee1455cd7a3092cd842cc05733b05151c0b5e49ea4f38c9ac945e2102a1b"}
Feb 18 14:19:43 crc kubenswrapper[4817]: I0218 14:19:43.266947 4817 generic.go:334] "Generic (PLEG): container finished" podID="f1544da4-e27b-4191-960d-d2a2145c6a42" containerID="c14342e7e3cade0d49811e211d0118efbb76d10f00b8a8c70653c697a6221a7a" exitCode=0
Feb 18 14:19:43 crc kubenswrapper[4817]: I0218 14:19:43.267051 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" event={"ID":"f1544da4-e27b-4191-960d-d2a2145c6a42","Type":"ContainerDied","Data":"c14342e7e3cade0d49811e211d0118efbb76d10f00b8a8c70653c697a6221a7a"}
Feb 18 14:19:43 crc kubenswrapper[4817]: I0218 14:19:43.271533 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"459bd237-6715-4821-8613-1b24c597b247","Type":"ContainerStarted","Data":"26a099e10bfc0bf4ac865885a1dd0e0c626c8870d54745a408ed5e9d466170d9"}
Feb 18 14:19:43 crc kubenswrapper[4817]: I0218 14:19:43.279819 4817 generic.go:334] "Generic (PLEG): container finished" podID="c85f7224-7818-41ea-a46b-3d55ac66cece" containerID="2d049fe5389d43826253222b153a9b4695d232e1861bfda298f5838892f8f621" exitCode=137
Feb 18 14:19:43 crc kubenswrapper[4817]: I0218 14:19:43.279934 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-smk2l" event={"ID":"c85f7224-7818-41ea-a46b-3d55ac66cece","Type":"ContainerDied","Data":"2d049fe5389d43826253222b153a9b4695d232e1861bfda298f5838892f8f621"}
Feb 18 14:19:43 crc kubenswrapper[4817]: I0218 14:19:43.281806 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c8c8d4f9c-f58g5" event={"ID":"206709c6-0550-4932-8f0e-f9d4c342a26c","Type":"ContainerStarted","Data":"8b9bb202948d78adde198398f70205841c99a751ef4de2a6a7c4ea5ddf55ff41"}
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.493240 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" podUID="f1544da4-e27b-4191-960d-d2a2145c6a42" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: connect: connection refused"
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.522545 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.652560 4817 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.724337 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-sb\") pod \"c85f7224-7818-41ea-a46b-3d55ac66cece\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") "
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.724453 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-svc\") pod \"c85f7224-7818-41ea-a46b-3d55ac66cece\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") "
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.724493 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-swift-storage-0\") pod \"c85f7224-7818-41ea-a46b-3d55ac66cece\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") "
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.724524 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blbkx\" (UniqueName: \"kubernetes.io/projected/c85f7224-7818-41ea-a46b-3d55ac66cece-kube-api-access-blbkx\") pod \"c85f7224-7818-41ea-a46b-3d55ac66cece\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") "
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.724601 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-nb\") pod \"c85f7224-7818-41ea-a46b-3d55ac66cece\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") "
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.724658 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-config\") pod \"c85f7224-7818-41ea-a46b-3d55ac66cece\" (UID: \"c85f7224-7818-41ea-a46b-3d55ac66cece\") "
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.735207 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85f7224-7818-41ea-a46b-3d55ac66cece-kube-api-access-blbkx" (OuterVolumeSpecName: "kube-api-access-blbkx") pod "c85f7224-7818-41ea-a46b-3d55ac66cece" (UID: "c85f7224-7818-41ea-a46b-3d55ac66cece"). InnerVolumeSpecName "kube-api-access-blbkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.784921 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-config" (OuterVolumeSpecName: "config") pod "c85f7224-7818-41ea-a46b-3d55ac66cece" (UID: "c85f7224-7818-41ea-a46b-3d55ac66cece"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.800238 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c85f7224-7818-41ea-a46b-3d55ac66cece" (UID: "c85f7224-7818-41ea-a46b-3d55ac66cece"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.835890 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.835922 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blbkx\" (UniqueName: \"kubernetes.io/projected/c85f7224-7818-41ea-a46b-3d55ac66cece-kube-api-access-blbkx\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.835937 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.840487 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c85f7224-7818-41ea-a46b-3d55ac66cece" (UID: "c85f7224-7818-41ea-a46b-3d55ac66cece"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.850511 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c85f7224-7818-41ea-a46b-3d55ac66cece" (UID: "c85f7224-7818-41ea-a46b-3d55ac66cece"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.862886 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c85f7224-7818-41ea-a46b-3d55ac66cece" (UID: "c85f7224-7818-41ea-a46b-3d55ac66cece"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.938342 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.938382 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:44 crc kubenswrapper[4817]: I0218 14:19:44.938396 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c85f7224-7818-41ea-a46b-3d55ac66cece-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.306703 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" event={"ID":"f1544da4-e27b-4191-960d-d2a2145c6a42","Type":"ContainerDied","Data":"4d9280af8a1dcf41baf26dc55c6c58e8abdd9dfa171442035a9c5ee2fe14f149"}
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.307007 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d9280af8a1dcf41baf26dc55c6c58e8abdd9dfa171442035a9c5ee2fe14f149"
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.309043 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-smk2l" event={"ID":"c85f7224-7818-41ea-a46b-3d55ac66cece","Type":"ContainerDied","Data":"ec3fd88083ad8710ce3c97055baf1639095ca1d0a921d5b1f756914d0f6bd9dd"}
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.309073 4817 scope.go:117] "RemoveContainer" containerID="2d049fe5389d43826253222b153a9b4695d232e1861bfda298f5838892f8f621"
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.309189 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95d56546f-smk2l"
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.321343 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c8c8d4f9c-f58g5" event={"ID":"206709c6-0550-4932-8f0e-f9d4c342a26c","Type":"ContainerStarted","Data":"e8ee12c217c57bb82956f536cabfbb0994df8da7dd633caf57fdbc9dd4dfa5c8"}
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.322689 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c8c8d4f9c-f58g5"
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.330426 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a414e293-71b9-44c3-8f07-20f3696f7db6","Type":"ContainerStarted","Data":"c17e382df06013ecd26a17408d6ad6dc514b3c5ec881ff7922d2a8d76fb1039b"}
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.333256 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eae76135-65ce-405b-8f64-55bb66e7de22","Type":"ContainerStarted","Data":"bc36f162fe93a5e7759536bc57cd9d936b0f1585401521e41f1f3361da1146dc"}
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.346366 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c8c8d4f9c-f58g5" podStartSLOduration=6.346348083 podStartE2EDuration="6.346348083s" podCreationTimestamp="2026-02-18 14:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:45.345743508 +0000 UTC m=+1247.921279481" watchObservedRunningTime="2026-02-18 14:19:45.346348083 +0000 UTC m=+1247.921884066"
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.372150 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f"
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.403401 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95d56546f-smk2l"]
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.422966 4817 scope.go:117] "RemoveContainer" containerID="d9e448eeffe6875db790257741d76906384b352c0bfb87810a0851bc44f249fa"
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.428608 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95d56546f-smk2l"]
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.450370 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-svc\") pod \"f1544da4-e27b-4191-960d-d2a2145c6a42\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") "
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.450502 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-sb\") pod \"f1544da4-e27b-4191-960d-d2a2145c6a42\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") "
Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.450544 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-config\") pod \"f1544da4-e27b-4191-960d-d2a2145c6a42\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") "
Feb 18 14:19:45 crc kubenswrapper[4817]:
I0218 14:19:45.450597 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skmws\" (UniqueName: \"kubernetes.io/projected/f1544da4-e27b-4191-960d-d2a2145c6a42-kube-api-access-skmws\") pod \"f1544da4-e27b-4191-960d-d2a2145c6a42\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.450755 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-swift-storage-0\") pod \"f1544da4-e27b-4191-960d-d2a2145c6a42\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.450796 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-nb\") pod \"f1544da4-e27b-4191-960d-d2a2145c6a42\" (UID: \"f1544da4-e27b-4191-960d-d2a2145c6a42\") " Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.467158 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1544da4-e27b-4191-960d-d2a2145c6a42-kube-api-access-skmws" (OuterVolumeSpecName: "kube-api-access-skmws") pod "f1544da4-e27b-4191-960d-d2a2145c6a42" (UID: "f1544da4-e27b-4191-960d-d2a2145c6a42"). InnerVolumeSpecName "kube-api-access-skmws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.555806 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skmws\" (UniqueName: \"kubernetes.io/projected/f1544da4-e27b-4191-960d-d2a2145c6a42-kube-api-access-skmws\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.567313 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1544da4-e27b-4191-960d-d2a2145c6a42" (UID: "f1544da4-e27b-4191-960d-d2a2145c6a42"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.568224 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1544da4-e27b-4191-960d-d2a2145c6a42" (UID: "f1544da4-e27b-4191-960d-d2a2145c6a42"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.570575 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1544da4-e27b-4191-960d-d2a2145c6a42" (UID: "f1544da4-e27b-4191-960d-d2a2145c6a42"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.570962 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-config" (OuterVolumeSpecName: "config") pod "f1544da4-e27b-4191-960d-d2a2145c6a42" (UID: "f1544da4-e27b-4191-960d-d2a2145c6a42"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.594575 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1544da4-e27b-4191-960d-d2a2145c6a42" (UID: "f1544da4-e27b-4191-960d-d2a2145c6a42"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.657386 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.657423 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.657432 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.657442 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:45 crc kubenswrapper[4817]: I0218 14:19:45.657450 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1544da4-e27b-4191-960d-d2a2145c6a42-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.067206 4817 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/cinder-scheduler-0" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.184588 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c85f7224-7818-41ea-a46b-3d55ac66cece" path="/var/lib/kubelet/pods/c85f7224-7818-41ea-a46b-3d55ac66cece/volumes" Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.354662 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"459bd237-6715-4821-8613-1b24c597b247","Type":"ContainerStarted","Data":"948d5d6021789e69b7281940d668ba8e1b9a56a119e64c506c79734eff6f04f9"} Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.354791 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7979dc8455-w5k2f" Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.354815 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="459bd237-6715-4821-8613-1b24c597b247" containerName="glance-log" containerID="cri-o://26a099e10bfc0bf4ac865885a1dd0e0c626c8870d54745a408ed5e9d466170d9" gracePeriod=30 Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.354887 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="459bd237-6715-4821-8613-1b24c597b247" containerName="glance-httpd" containerID="cri-o://948d5d6021789e69b7281940d668ba8e1b9a56a119e64c506c79734eff6f04f9" gracePeriod=30 Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.436184 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.436161938 podStartE2EDuration="7.436161938s" podCreationTimestamp="2026-02-18 14:19:39 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:46.431940362 +0000 UTC m=+1249.007476345" watchObservedRunningTime="2026-02-18 14:19:46.436161938 +0000 UTC m=+1249.011697921" Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.439103 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.439090152 podStartE2EDuration="7.439090152s" podCreationTimestamp="2026-02-18 14:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:46.396301646 +0000 UTC m=+1248.971837629" watchObservedRunningTime="2026-02-18 14:19:46.439090152 +0000 UTC m=+1249.014626145" Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.463456 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.463438104 podStartE2EDuration="8.463438104s" podCreationTimestamp="2026-02-18 14:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:46.46091907 +0000 UTC m=+1249.036455053" watchObservedRunningTime="2026-02-18 14:19:46.463438104 +0000 UTC m=+1249.038974087" Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.554367 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-w5k2f"] Feb 18 14:19:46 crc kubenswrapper[4817]: I0218 14:19:46.573923 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-w5k2f"] Feb 18 14:19:47 crc kubenswrapper[4817]: I0218 14:19:47.367322 4817 generic.go:334] "Generic (PLEG): container finished" podID="459bd237-6715-4821-8613-1b24c597b247" containerID="948d5d6021789e69b7281940d668ba8e1b9a56a119e64c506c79734eff6f04f9" exitCode=0 Feb 18 14:19:47 crc kubenswrapper[4817]: I0218 
14:19:47.367352 4817 generic.go:334] "Generic (PLEG): container finished" podID="459bd237-6715-4821-8613-1b24c597b247" containerID="26a099e10bfc0bf4ac865885a1dd0e0c626c8870d54745a408ed5e9d466170d9" exitCode=143 Feb 18 14:19:47 crc kubenswrapper[4817]: I0218 14:19:47.367363 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"459bd237-6715-4821-8613-1b24c597b247","Type":"ContainerDied","Data":"948d5d6021789e69b7281940d668ba8e1b9a56a119e64c506c79734eff6f04f9"} Feb 18 14:19:47 crc kubenswrapper[4817]: I0218 14:19:47.367441 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"459bd237-6715-4821-8613-1b24c597b247","Type":"ContainerDied","Data":"26a099e10bfc0bf4ac865885a1dd0e0c626c8870d54745a408ed5e9d466170d9"} Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.183742 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1544da4-e27b-4191-960d-d2a2145c6a42" path="/var/lib/kubelet/pods/f1544da4-e27b-4191-960d-d2a2145c6a42/volumes" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.232842 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.313762 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-httpd-run\") pod \"459bd237-6715-4821-8613-1b24c597b247\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.314090 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-public-tls-certs\") pod \"459bd237-6715-4821-8613-1b24c597b247\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.314158 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-scripts\") pod \"459bd237-6715-4821-8613-1b24c597b247\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.314387 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"459bd237-6715-4821-8613-1b24c597b247\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.314435 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-logs\") pod \"459bd237-6715-4821-8613-1b24c597b247\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.314456 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79875\" (UniqueName: 
\"kubernetes.io/projected/459bd237-6715-4821-8613-1b24c597b247-kube-api-access-79875\") pod \"459bd237-6715-4821-8613-1b24c597b247\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.314475 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-combined-ca-bundle\") pod \"459bd237-6715-4821-8613-1b24c597b247\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.314508 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-config-data\") pod \"459bd237-6715-4821-8613-1b24c597b247\" (UID: \"459bd237-6715-4821-8613-1b24c597b247\") " Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.315847 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "459bd237-6715-4821-8613-1b24c597b247" (UID: "459bd237-6715-4821-8613-1b24c597b247"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.318644 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-logs" (OuterVolumeSpecName: "logs") pod "459bd237-6715-4821-8613-1b24c597b247" (UID: "459bd237-6715-4821-8613-1b24c597b247"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.327123 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-scripts" (OuterVolumeSpecName: "scripts") pod "459bd237-6715-4821-8613-1b24c597b247" (UID: "459bd237-6715-4821-8613-1b24c597b247"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.338829 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/459bd237-6715-4821-8613-1b24c597b247-kube-api-access-79875" (OuterVolumeSpecName: "kube-api-access-79875") pod "459bd237-6715-4821-8613-1b24c597b247" (UID: "459bd237-6715-4821-8613-1b24c597b247"). InnerVolumeSpecName "kube-api-access-79875". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.354589 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c" (OuterVolumeSpecName: "glance") pod "459bd237-6715-4821-8613-1b24c597b247" (UID: "459bd237-6715-4821-8613-1b24c597b247"). InnerVolumeSpecName "pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.358324 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "459bd237-6715-4821-8613-1b24c597b247" (UID: "459bd237-6715-4821-8613-1b24c597b247"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.382777 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"459bd237-6715-4821-8613-1b24c597b247","Type":"ContainerDied","Data":"7d758ee87f43bc2a0c84fe31628662ed96e252698309e06aa5e31e1e35ad3f1d"} Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.382836 4817 scope.go:117] "RemoveContainer" containerID="948d5d6021789e69b7281940d668ba8e1b9a56a119e64c506c79734eff6f04f9" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.383025 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.398606 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "459bd237-6715-4821-8613-1b24c597b247" (UID: "459bd237-6715-4821-8613-1b24c597b247"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.416737 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.416786 4817 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") on node \"crc\" " Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.416801 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.416813 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79875\" (UniqueName: \"kubernetes.io/projected/459bd237-6715-4821-8613-1b24c597b247-kube-api-access-79875\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.416822 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.416830 4817 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/459bd237-6715-4821-8613-1b24c597b247-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.416838 4817 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 
14:19:48.417699 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-config-data" (OuterVolumeSpecName: "config-data") pod "459bd237-6715-4821-8613-1b24c597b247" (UID: "459bd237-6715-4821-8613-1b24c597b247"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.443992 4817 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.444199 4817 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c") on node "crc" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.480585 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.518315 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/459bd237-6715-4821-8613-1b24c597b247-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.518360 4817 reconciler_common.go:293] "Volume detached for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.690237 4817 scope.go:117] "RemoveContainer" containerID="26a099e10bfc0bf4ac865885a1dd0e0c626c8870d54745a408ed5e9d466170d9" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.726576 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 
14:19:48.739899 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.758235 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:48 crc kubenswrapper[4817]: E0218 14:19:48.758778 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1544da4-e27b-4191-960d-d2a2145c6a42" containerName="init" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.758796 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1544da4-e27b-4191-960d-d2a2145c6a42" containerName="init" Feb 18 14:19:48 crc kubenswrapper[4817]: E0218 14:19:48.758807 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1544da4-e27b-4191-960d-d2a2145c6a42" containerName="dnsmasq-dns" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.758830 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1544da4-e27b-4191-960d-d2a2145c6a42" containerName="dnsmasq-dns" Feb 18 14:19:48 crc kubenswrapper[4817]: E0218 14:19:48.758843 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85f7224-7818-41ea-a46b-3d55ac66cece" containerName="init" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.758848 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85f7224-7818-41ea-a46b-3d55ac66cece" containerName="init" Feb 18 14:19:48 crc kubenswrapper[4817]: E0218 14:19:48.758862 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459bd237-6715-4821-8613-1b24c597b247" containerName="glance-log" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.758868 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="459bd237-6715-4821-8613-1b24c597b247" containerName="glance-log" Feb 18 14:19:48 crc kubenswrapper[4817]: E0218 14:19:48.758881 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85f7224-7818-41ea-a46b-3d55ac66cece" 
containerName="dnsmasq-dns" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.758900 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85f7224-7818-41ea-a46b-3d55ac66cece" containerName="dnsmasq-dns" Feb 18 14:19:48 crc kubenswrapper[4817]: E0218 14:19:48.758912 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459bd237-6715-4821-8613-1b24c597b247" containerName="glance-httpd" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.758917 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="459bd237-6715-4821-8613-1b24c597b247" containerName="glance-httpd" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.759118 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1544da4-e27b-4191-960d-d2a2145c6a42" containerName="dnsmasq-dns" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.759134 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="459bd237-6715-4821-8613-1b24c597b247" containerName="glance-httpd" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.759160 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="459bd237-6715-4821-8613-1b24c597b247" containerName="glance-log" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.759174 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="c85f7224-7818-41ea-a46b-3d55ac66cece" containerName="dnsmasq-dns" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.760405 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.765589 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.765712 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.775077 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.824774 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gtn9\" (UniqueName: \"kubernetes.io/projected/fc799768-b6dd-4b19-aee6-909d985e2441-kube-api-access-6gtn9\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.824849 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.824872 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.824937 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fc799768-b6dd-4b19-aee6-909d985e2441-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.825001 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc799768-b6dd-4b19-aee6-909d985e2441-logs\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.825083 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.825103 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.825182 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.928309 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6gtn9\" (UniqueName: \"kubernetes.io/projected/fc799768-b6dd-4b19-aee6-909d985e2441-kube-api-access-6gtn9\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.928370 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.928402 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.928454 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc799768-b6dd-4b19-aee6-909d985e2441-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.928488 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc799768-b6dd-4b19-aee6-909d985e2441-logs\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.928569 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.928607 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.928674 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.929169 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc799768-b6dd-4b19-aee6-909d985e2441-logs\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.929474 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc799768-b6dd-4b19-aee6-909d985e2441-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.934194 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.934225 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/33f17c260ac02f21898fdc222178f71cf5d08760d15202906af70758a1a97161/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.934266 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.934366 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.934423 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.939541 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc799768-b6dd-4b19-aee6-909d985e2441-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.950274 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gtn9\" (UniqueName: \"kubernetes.io/projected/fc799768-b6dd-4b19-aee6-909d985e2441-kube-api-access-6gtn9\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:48 crc kubenswrapper[4817]: I0218 14:19:48.988284 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8abaec47-e902-4ec5-bad1-d4dfc189915c\") pod \"glance-default-external-api-0\" (UID: \"fc799768-b6dd-4b19-aee6-909d985e2441\") " pod="openstack/glance-default-external-api-0" Feb 18 14:19:49 crc kubenswrapper[4817]: I0218 14:19:49.089955 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 18 14:19:49 crc kubenswrapper[4817]: I0218 14:19:49.579178 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 18 14:19:49 crc kubenswrapper[4817]: I0218 14:19:49.653118 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 14:19:49 crc kubenswrapper[4817]: W0218 14:19:49.724672 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc799768_b6dd_4b19_aee6_909d985e2441.slice/crio-8afd186bff4e9831919e8260862b75de4ad3120fbd5f120e18c36cf77f1e2d3c WatchSource:0}: Error finding container 8afd186bff4e9831919e8260862b75de4ad3120fbd5f120e18c36cf77f1e2d3c: Status 404 returned error can't find the container with id 8afd186bff4e9831919e8260862b75de4ad3120fbd5f120e18c36cf77f1e2d3c Feb 18 14:19:49 crc kubenswrapper[4817]: I0218 14:19:49.749903 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.020105 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.020386 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.034518 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.110108 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.125110 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.192564 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="459bd237-6715-4821-8613-1b24c597b247" path="/var/lib/kubelet/pods/459bd237-6715-4821-8613-1b24c597b247/volumes" Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.416410 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a","Type":"ContainerStarted","Data":"0a6317101227e2ad8e39fd2c97747591d2646ae8aefe7e421d9081caf929b39b"} Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.416545 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.418031 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc799768-b6dd-4b19-aee6-909d985e2441","Type":"ContainerStarted","Data":"8afd186bff4e9831919e8260862b75de4ad3120fbd5f120e18c36cf77f1e2d3c"} Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.418146 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eae76135-65ce-405b-8f64-55bb66e7de22" containerName="glance-log" containerID="cri-o://00b7ee1455cd7a3092cd842cc05733b05151c0b5e49ea4f38c9ac945e2102a1b" gracePeriod=30 Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.418257 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eae76135-65ce-405b-8f64-55bb66e7de22" containerName="glance-httpd" containerID="cri-o://bc36f162fe93a5e7759536bc57cd9d936b0f1585401521e41f1f3361da1146dc" gracePeriod=30 Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.418372 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:50 crc 
kubenswrapper[4817]: I0218 14:19:50.418400 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.418551 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerName="cinder-scheduler" containerID="cri-o://21a162474f340754b12b22f618a7353fa5b951b83e285a68a10c6cb791a6a248" gracePeriod=30 Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.418572 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerName="probe" containerID="cri-o://ec8203377c9136561a642d02e614be61b931ce2db90459c2b1b5eee7adbb51ec" gracePeriod=30 Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.465671 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.224548955 podStartE2EDuration="21.465651588s" podCreationTimestamp="2026-02-18 14:19:29 +0000 UTC" firstStartedPulling="2026-02-18 14:19:31.449443214 +0000 UTC m=+1234.024979197" lastFinishedPulling="2026-02-18 14:19:48.690545847 +0000 UTC m=+1251.266081830" observedRunningTime="2026-02-18 14:19:50.455112483 +0000 UTC m=+1253.030648476" watchObservedRunningTime="2026-02-18 14:19:50.465651588 +0000 UTC m=+1253.041187581" Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.576922 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="eae76135-65ce-405b-8f64-55bb66e7de22" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.191:9292/healthcheck\": EOF" Feb 18 14:19:50 crc kubenswrapper[4817]: I0218 14:19:50.576925 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" 
podUID="eae76135-65ce-405b-8f64-55bb66e7de22" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.191:9292/healthcheck\": read tcp 10.217.0.2:35706->10.217.0.191:9292: read: connection reset by peer" Feb 18 14:19:51 crc kubenswrapper[4817]: I0218 14:19:51.428309 4817 generic.go:334] "Generic (PLEG): container finished" podID="eae76135-65ce-405b-8f64-55bb66e7de22" containerID="00b7ee1455cd7a3092cd842cc05733b05151c0b5e49ea4f38c9ac945e2102a1b" exitCode=143 Feb 18 14:19:51 crc kubenswrapper[4817]: I0218 14:19:51.428405 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eae76135-65ce-405b-8f64-55bb66e7de22","Type":"ContainerDied","Data":"00b7ee1455cd7a3092cd842cc05733b05151c0b5e49ea4f38c9ac945e2102a1b"} Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.446581 4817 generic.go:334] "Generic (PLEG): container finished" podID="eae76135-65ce-405b-8f64-55bb66e7de22" containerID="bc36f162fe93a5e7759536bc57cd9d936b0f1585401521e41f1f3361da1146dc" exitCode=0 Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.446954 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eae76135-65ce-405b-8f64-55bb66e7de22","Type":"ContainerDied","Data":"bc36f162fe93a5e7759536bc57cd9d936b0f1585401521e41f1f3361da1146dc"} Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.448710 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc799768-b6dd-4b19-aee6-909d985e2441","Type":"ContainerStarted","Data":"e73efb26d3733ad0e06c88042c77e68729f787341bdf0e0883bdd20cab733a92"} Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.452178 4817 generic.go:334] "Generic (PLEG): container finished" podID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerID="ec8203377c9136561a642d02e614be61b931ce2db90459c2b1b5eee7adbb51ec" exitCode=0 Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 
14:19:52.452212 4817 generic.go:334] "Generic (PLEG): container finished" podID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerID="21a162474f340754b12b22f618a7353fa5b951b83e285a68a10c6cb791a6a248" exitCode=0 Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.452234 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90dec55d-864d-49da-b960-0a8e51e4d0ad","Type":"ContainerDied","Data":"ec8203377c9136561a642d02e614be61b931ce2db90459c2b1b5eee7adbb51ec"} Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.452258 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90dec55d-864d-49da-b960-0a8e51e4d0ad","Type":"ContainerDied","Data":"21a162474f340754b12b22f618a7353fa5b951b83e285a68a10c6cb791a6a248"} Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.731533 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.826721 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-logs\") pod \"eae76135-65ce-405b-8f64-55bb66e7de22\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.826802 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb6sp\" (UniqueName: \"kubernetes.io/projected/eae76135-65ce-405b-8f64-55bb66e7de22-kube-api-access-kb6sp\") pod \"eae76135-65ce-405b-8f64-55bb66e7de22\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.826892 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-combined-ca-bundle\") pod 
\"eae76135-65ce-405b-8f64-55bb66e7de22\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.826939 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-config-data\") pod \"eae76135-65ce-405b-8f64-55bb66e7de22\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.827088 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"eae76135-65ce-405b-8f64-55bb66e7de22\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.827142 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-internal-tls-certs\") pod \"eae76135-65ce-405b-8f64-55bb66e7de22\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.827275 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-scripts\") pod \"eae76135-65ce-405b-8f64-55bb66e7de22\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.827314 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-httpd-run\") pod \"eae76135-65ce-405b-8f64-55bb66e7de22\" (UID: \"eae76135-65ce-405b-8f64-55bb66e7de22\") " Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.827338 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-logs" (OuterVolumeSpecName: "logs") pod "eae76135-65ce-405b-8f64-55bb66e7de22" (UID: "eae76135-65ce-405b-8f64-55bb66e7de22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.827955 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.828289 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eae76135-65ce-405b-8f64-55bb66e7de22" (UID: "eae76135-65ce-405b-8f64-55bb66e7de22"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.838150 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-scripts" (OuterVolumeSpecName: "scripts") pod "eae76135-65ce-405b-8f64-55bb66e7de22" (UID: "eae76135-65ce-405b-8f64-55bb66e7de22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.838212 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae76135-65ce-405b-8f64-55bb66e7de22-kube-api-access-kb6sp" (OuterVolumeSpecName: "kube-api-access-kb6sp") pod "eae76135-65ce-405b-8f64-55bb66e7de22" (UID: "eae76135-65ce-405b-8f64-55bb66e7de22"). InnerVolumeSpecName "kube-api-access-kb6sp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.856756 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919" (OuterVolumeSpecName: "glance") pod "eae76135-65ce-405b-8f64-55bb66e7de22" (UID: "eae76135-65ce-405b-8f64-55bb66e7de22"). InnerVolumeSpecName "pvc-be497b0f-a266-4a29-bed4-fc9e942da919". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.879339 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eae76135-65ce-405b-8f64-55bb66e7de22" (UID: "eae76135-65ce-405b-8f64-55bb66e7de22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.915125 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eae76135-65ce-405b-8f64-55bb66e7de22" (UID: "eae76135-65ce-405b-8f64-55bb66e7de22"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.924123 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-config-data" (OuterVolumeSpecName: "config-data") pod "eae76135-65ce-405b-8f64-55bb66e7de22" (UID: "eae76135-65ce-405b-8f64-55bb66e7de22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.931472 4817 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") on node \"crc\" " Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.931513 4817 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.931527 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.931538 4817 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eae76135-65ce-405b-8f64-55bb66e7de22-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.931547 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb6sp\" (UniqueName: \"kubernetes.io/projected/eae76135-65ce-405b-8f64-55bb66e7de22-kube-api-access-kb6sp\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.931557 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.931564 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eae76135-65ce-405b-8f64-55bb66e7de22-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:52 crc 
kubenswrapper[4817]: I0218 14:19:52.957756 4817 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 18 14:19:52 crc kubenswrapper[4817]: I0218 14:19:52.957957 4817 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-be497b0f-a266-4a29-bed4-fc9e942da919" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919") on node "crc" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.033094 4817 reconciler_common.go:293] "Volume detached for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.099376 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.099716 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="ceilometer-central-agent" containerID="cri-o://3c57c14c6a1470b98b5ec85da130fbeefabf9505c6c010408d4e953e7f4a7ca1" gracePeriod=30 Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.100179 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="ceilometer-notification-agent" containerID="cri-o://9b10b55573274a7ecaded4c0b33a6e94a7e8aa54fd8f5827328e8fb7c089eac6" gracePeriod=30 Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.100188 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="sg-core" containerID="cri-o://41f2843f88e5027daafa351b27095b39840388b314da0591e06ecb86d9b2bd49" gracePeriod=30 Feb 18 
14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.100179 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="proxy-httpd" containerID="cri-o://0a6317101227e2ad8e39fd2c97747591d2646ae8aefe7e421d9081caf929b39b" gracePeriod=30 Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.227380 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rk57x"] Feb 18 14:19:53 crc kubenswrapper[4817]: E0218 14:19:53.227901 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae76135-65ce-405b-8f64-55bb66e7de22" containerName="glance-log" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.227919 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae76135-65ce-405b-8f64-55bb66e7de22" containerName="glance-log" Feb 18 14:19:53 crc kubenswrapper[4817]: E0218 14:19:53.227940 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae76135-65ce-405b-8f64-55bb66e7de22" containerName="glance-httpd" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.227947 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae76135-65ce-405b-8f64-55bb66e7de22" containerName="glance-httpd" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.232475 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae76135-65ce-405b-8f64-55bb66e7de22" containerName="glance-httpd" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.232505 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae76135-65ce-405b-8f64-55bb66e7de22" containerName="glance-log" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.234441 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rk57x" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.289897 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.344686 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-scripts\") pod \"90dec55d-864d-49da-b960-0a8e51e4d0ad\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.344767 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90dec55d-864d-49da-b960-0a8e51e4d0ad-etc-machine-id\") pod \"90dec55d-864d-49da-b960-0a8e51e4d0ad\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.344812 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86b5k\" (UniqueName: \"kubernetes.io/projected/90dec55d-864d-49da-b960-0a8e51e4d0ad-kube-api-access-86b5k\") pod \"90dec55d-864d-49da-b960-0a8e51e4d0ad\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.344873 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data-custom\") pod \"90dec55d-864d-49da-b960-0a8e51e4d0ad\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.344921 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data\") pod \"90dec55d-864d-49da-b960-0a8e51e4d0ad\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.345096 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-combined-ca-bundle\") pod \"90dec55d-864d-49da-b960-0a8e51e4d0ad\" (UID: \"90dec55d-864d-49da-b960-0a8e51e4d0ad\") " Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.345459 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5ds5\" (UniqueName: \"kubernetes.io/projected/4fe876a5-4499-40ff-b468-d395efa01d26-kube-api-access-t5ds5\") pod \"nova-api-db-create-rk57x\" (UID: \"4fe876a5-4499-40ff-b468-d395efa01d26\") " pod="openstack/nova-api-db-create-rk57x" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.345643 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe876a5-4499-40ff-b468-d395efa01d26-operator-scripts\") pod \"nova-api-db-create-rk57x\" (UID: \"4fe876a5-4499-40ff-b468-d395efa01d26\") " pod="openstack/nova-api-db-create-rk57x" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.348083 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90dec55d-864d-49da-b960-0a8e51e4d0ad-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "90dec55d-864d-49da-b960-0a8e51e4d0ad" (UID: "90dec55d-864d-49da-b960-0a8e51e4d0ad"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.359466 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90dec55d-864d-49da-b960-0a8e51e4d0ad-kube-api-access-86b5k" (OuterVolumeSpecName: "kube-api-access-86b5k") pod "90dec55d-864d-49da-b960-0a8e51e4d0ad" (UID: "90dec55d-864d-49da-b960-0a8e51e4d0ad"). InnerVolumeSpecName "kube-api-access-86b5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.362123 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rk57x"] Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.362186 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "90dec55d-864d-49da-b960-0a8e51e4d0ad" (UID: "90dec55d-864d-49da-b960-0a8e51e4d0ad"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.362236 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-scripts" (OuterVolumeSpecName: "scripts") pod "90dec55d-864d-49da-b960-0a8e51e4d0ad" (UID: "90dec55d-864d-49da-b960-0a8e51e4d0ad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.392181 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-pjb2r"] Feb 18 14:19:53 crc kubenswrapper[4817]: E0218 14:19:53.392587 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerName="cinder-scheduler" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.392601 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerName="cinder-scheduler" Feb 18 14:19:53 crc kubenswrapper[4817]: E0218 14:19:53.392622 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerName="probe" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.392630 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerName="probe" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.392799 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerName="probe" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.392817 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" containerName="cinder-scheduler" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.393496 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pjb2r" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.436777 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pjb2r"] Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.450317 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe876a5-4499-40ff-b468-d395efa01d26-operator-scripts\") pod \"nova-api-db-create-rk57x\" (UID: \"4fe876a5-4499-40ff-b468-d395efa01d26\") " pod="openstack/nova-api-db-create-rk57x" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.450461 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5ds5\" (UniqueName: \"kubernetes.io/projected/4fe876a5-4499-40ff-b468-d395efa01d26-kube-api-access-t5ds5\") pod \"nova-api-db-create-rk57x\" (UID: \"4fe876a5-4499-40ff-b468-d395efa01d26\") " pod="openstack/nova-api-db-create-rk57x" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.450502 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-operator-scripts\") pod \"nova-cell0-db-create-pjb2r\" (UID: \"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1\") " pod="openstack/nova-cell0-db-create-pjb2r" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.450532 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv2hb\" (UniqueName: \"kubernetes.io/projected/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-kube-api-access-sv2hb\") pod \"nova-cell0-db-create-pjb2r\" (UID: \"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1\") " pod="openstack/nova-cell0-db-create-pjb2r" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.450707 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.450722 4817 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90dec55d-864d-49da-b960-0a8e51e4d0ad-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.450736 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86b5k\" (UniqueName: \"kubernetes.io/projected/90dec55d-864d-49da-b960-0a8e51e4d0ad-kube-api-access-86b5k\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.450747 4817 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.451613 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe876a5-4499-40ff-b468-d395efa01d26-operator-scripts\") pod \"nova-api-db-create-rk57x\" (UID: \"4fe876a5-4499-40ff-b468-d395efa01d26\") " pod="openstack/nova-api-db-create-rk57x" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.525488 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5ds5\" (UniqueName: \"kubernetes.io/projected/4fe876a5-4499-40ff-b468-d395efa01d26-kube-api-access-t5ds5\") pod \"nova-api-db-create-rk57x\" (UID: \"4fe876a5-4499-40ff-b468-d395efa01d26\") " pod="openstack/nova-api-db-create-rk57x" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.541076 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e448-account-create-update-fzq87"] Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.542747 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e448-account-create-update-fzq87" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.545323 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90dec55d-864d-49da-b960-0a8e51e4d0ad" (UID: "90dec55d-864d-49da-b960-0a8e51e4d0ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.545459 4817 generic.go:334] "Generic (PLEG): container finished" podID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerID="41f2843f88e5027daafa351b27095b39840388b314da0591e06ecb86d9b2bd49" exitCode=2 Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.545536 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a","Type":"ContainerDied","Data":"41f2843f88e5027daafa351b27095b39840388b314da0591e06ecb86d9b2bd49"} Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.552450 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-operator-scripts\") pod \"nova-cell0-db-create-pjb2r\" (UID: \"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1\") " pod="openstack/nova-cell0-db-create-pjb2r" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.552501 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv2hb\" (UniqueName: \"kubernetes.io/projected/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-kube-api-access-sv2hb\") pod \"nova-cell0-db-create-pjb2r\" (UID: \"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1\") " pod="openstack/nova-cell0-db-create-pjb2r" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.552626 4817 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.553583 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-operator-scripts\") pod \"nova-cell0-db-create-pjb2r\" (UID: \"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1\") " pod="openstack/nova-cell0-db-create-pjb2r" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.556155 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-54xpq"] Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.557800 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-54xpq" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.561152 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.561182 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eae76135-65ce-405b-8f64-55bb66e7de22","Type":"ContainerDied","Data":"7f16d213ed7ec1f8b7ec07b54781f87fb8619d8f2cb9021138a53fdad020930d"} Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.564888 4817 scope.go:117] "RemoveContainer" containerID="bc36f162fe93a5e7759536bc57cd9d936b0f1585401521e41f1f3361da1146dc" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.571050 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e448-account-create-update-fzq87"] Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.578296 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-54xpq"] Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.585433 4817 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-db-create-rk57x" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.586226 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.619273 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc799768-b6dd-4b19-aee6-909d985e2441","Type":"ContainerStarted","Data":"bb5a168995a9fb2b64aa5c620fcfe2813c98b376e0cae346c04d5d3eb41a19ad"} Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.619528 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv2hb\" (UniqueName: \"kubernetes.io/projected/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-kube-api-access-sv2hb\") pod \"nova-cell0-db-create-pjb2r\" (UID: \"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1\") " pod="openstack/nova-cell0-db-create-pjb2r" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.682604 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90dec55d-864d-49da-b960-0a8e51e4d0ad","Type":"ContainerDied","Data":"4f3ad76fad3426460ffadd9343cac72bcd7b354bcc695f9a091e0bcd3cf85fdd"} Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.682730 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.710771 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data" (OuterVolumeSpecName: "config-data") pod "90dec55d-864d-49da-b960-0a8e51e4d0ad" (UID: "90dec55d-864d-49da-b960-0a8e51e4d0ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.737421 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pjb2r" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.755530 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc4d4\" (UniqueName: \"kubernetes.io/projected/42f4b322-ace9-42b0-944b-c5fa3181fc54-kube-api-access-vc4d4\") pod \"nova-cell1-db-create-54xpq\" (UID: \"42f4b322-ace9-42b0-944b-c5fa3181fc54\") " pod="openstack/nova-cell1-db-create-54xpq" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.755620 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/585ec0f8-a374-44ae-8b97-024af4983f69-operator-scripts\") pod \"nova-api-e448-account-create-update-fzq87\" (UID: \"585ec0f8-a374-44ae-8b97-024af4983f69\") " pod="openstack/nova-api-e448-account-create-update-fzq87" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.755721 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nhxs\" (UniqueName: \"kubernetes.io/projected/585ec0f8-a374-44ae-8b97-024af4983f69-kube-api-access-2nhxs\") pod \"nova-api-e448-account-create-update-fzq87\" (UID: \"585ec0f8-a374-44ae-8b97-024af4983f69\") " pod="openstack/nova-api-e448-account-create-update-fzq87" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.755747 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42f4b322-ace9-42b0-944b-c5fa3181fc54-operator-scripts\") pod \"nova-cell1-db-create-54xpq\" (UID: \"42f4b322-ace9-42b0-944b-c5fa3181fc54\") " pod="openstack/nova-cell1-db-create-54xpq" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.755856 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/90dec55d-864d-49da-b960-0a8e51e4d0ad-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.765515 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.765494337 podStartE2EDuration="5.765494337s" podCreationTimestamp="2026-02-18 14:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:53.692532563 +0000 UTC m=+1256.268068546" watchObservedRunningTime="2026-02-18 14:19:53.765494337 +0000 UTC m=+1256.341030320" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.806336 4817 scope.go:117] "RemoveContainer" containerID="00b7ee1455cd7a3092cd842cc05733b05151c0b5e49ea4f38c9ac945e2102a1b" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.806457 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-16c7-account-create-update-qgzsl"] Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.807760 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.812290 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.841533 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-16c7-account-create-update-qgzsl"] Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.857494 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nhxs\" (UniqueName: \"kubernetes.io/projected/585ec0f8-a374-44ae-8b97-024af4983f69-kube-api-access-2nhxs\") pod \"nova-api-e448-account-create-update-fzq87\" (UID: \"585ec0f8-a374-44ae-8b97-024af4983f69\") " pod="openstack/nova-api-e448-account-create-update-fzq87" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.857568 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42f4b322-ace9-42b0-944b-c5fa3181fc54-operator-scripts\") pod \"nova-cell1-db-create-54xpq\" (UID: \"42f4b322-ace9-42b0-944b-c5fa3181fc54\") " pod="openstack/nova-cell1-db-create-54xpq" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.857734 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc4d4\" (UniqueName: \"kubernetes.io/projected/42f4b322-ace9-42b0-944b-c5fa3181fc54-kube-api-access-vc4d4\") pod \"nova-cell1-db-create-54xpq\" (UID: \"42f4b322-ace9-42b0-944b-c5fa3181fc54\") " pod="openstack/nova-cell1-db-create-54xpq" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.857789 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/585ec0f8-a374-44ae-8b97-024af4983f69-operator-scripts\") pod \"nova-api-e448-account-create-update-fzq87\" (UID: 
\"585ec0f8-a374-44ae-8b97-024af4983f69\") " pod="openstack/nova-api-e448-account-create-update-fzq87" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.858682 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/585ec0f8-a374-44ae-8b97-024af4983f69-operator-scripts\") pod \"nova-api-e448-account-create-update-fzq87\" (UID: \"585ec0f8-a374-44ae-8b97-024af4983f69\") " pod="openstack/nova-api-e448-account-create-update-fzq87" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.858998 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42f4b322-ace9-42b0-944b-c5fa3181fc54-operator-scripts\") pod \"nova-cell1-db-create-54xpq\" (UID: \"42f4b322-ace9-42b0-944b-c5fa3181fc54\") " pod="openstack/nova-cell1-db-create-54xpq" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.883695 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc4d4\" (UniqueName: \"kubernetes.io/projected/42f4b322-ace9-42b0-944b-c5fa3181fc54-kube-api-access-vc4d4\") pod \"nova-cell1-db-create-54xpq\" (UID: \"42f4b322-ace9-42b0-944b-c5fa3181fc54\") " pod="openstack/nova-cell1-db-create-54xpq" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.885077 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nhxs\" (UniqueName: \"kubernetes.io/projected/585ec0f8-a374-44ae-8b97-024af4983f69-kube-api-access-2nhxs\") pod \"nova-api-e448-account-create-update-fzq87\" (UID: \"585ec0f8-a374-44ae-8b97-024af4983f69\") " pod="openstack/nova-api-e448-account-create-update-fzq87" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.959416 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpzq9\" (UniqueName: \"kubernetes.io/projected/43f9dae0-f2ed-4f91-b922-6f3432c8997d-kube-api-access-tpzq9\") pod 
\"nova-cell0-16c7-account-create-update-qgzsl\" (UID: \"43f9dae0-f2ed-4f91-b922-6f3432c8997d\") " pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.959560 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43f9dae0-f2ed-4f91-b922-6f3432c8997d-operator-scripts\") pod \"nova-cell0-16c7-account-create-update-qgzsl\" (UID: \"43f9dae0-f2ed-4f91-b922-6f3432c8997d\") " pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.963623 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9b47-account-create-update-cxfbq"] Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.964995 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.968700 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 18 14:19:53 crc kubenswrapper[4817]: I0218 14:19:53.971136 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e448-account-create-update-fzq87" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.035321 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9b47-account-create-update-cxfbq"] Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.062277 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvpj\" (UniqueName: \"kubernetes.io/projected/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-kube-api-access-5lvpj\") pod \"nova-cell1-9b47-account-create-update-cxfbq\" (UID: \"c5d3725d-5bb0-4edd-b707-6690d2ac99f5\") " pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.062341 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpzq9\" (UniqueName: \"kubernetes.io/projected/43f9dae0-f2ed-4f91-b922-6f3432c8997d-kube-api-access-tpzq9\") pod \"nova-cell0-16c7-account-create-update-qgzsl\" (UID: \"43f9dae0-f2ed-4f91-b922-6f3432c8997d\") " pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.062431 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-operator-scripts\") pod \"nova-cell1-9b47-account-create-update-cxfbq\" (UID: \"c5d3725d-5bb0-4edd-b707-6690d2ac99f5\") " pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.062474 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43f9dae0-f2ed-4f91-b922-6f3432c8997d-operator-scripts\") pod \"nova-cell0-16c7-account-create-update-qgzsl\" (UID: \"43f9dae0-f2ed-4f91-b922-6f3432c8997d\") " 
pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.063597 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43f9dae0-f2ed-4f91-b922-6f3432c8997d-operator-scripts\") pod \"nova-cell0-16c7-account-create-update-qgzsl\" (UID: \"43f9dae0-f2ed-4f91-b922-6f3432c8997d\") " pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.099096 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpzq9\" (UniqueName: \"kubernetes.io/projected/43f9dae0-f2ed-4f91-b922-6f3432c8997d-kube-api-access-tpzq9\") pod \"nova-cell0-16c7-account-create-update-qgzsl\" (UID: \"43f9dae0-f2ed-4f91-b922-6f3432c8997d\") " pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.113799 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.114281 4817 scope.go:117] "RemoveContainer" containerID="ec8203377c9136561a642d02e614be61b931ce2db90459c2b1b5eee7adbb51ec" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.130889 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-54xpq" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.165177 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-operator-scripts\") pod \"nova-cell1-9b47-account-create-update-cxfbq\" (UID: \"c5d3725d-5bb0-4edd-b707-6690d2ac99f5\") " pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.165314 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvpj\" (UniqueName: \"kubernetes.io/projected/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-kube-api-access-5lvpj\") pod \"nova-cell1-9b47-account-create-update-cxfbq\" (UID: \"c5d3725d-5bb0-4edd-b707-6690d2ac99f5\") " pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.168189 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.170457 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-operator-scripts\") pod \"nova-cell1-9b47-account-create-update-cxfbq\" (UID: \"c5d3725d-5bb0-4edd-b707-6690d2ac99f5\") " pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.200651 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvpj\" (UniqueName: \"kubernetes.io/projected/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-kube-api-access-5lvpj\") pod \"nova-cell1-9b47-account-create-update-cxfbq\" (UID: \"c5d3725d-5bb0-4edd-b707-6690d2ac99f5\") " pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.272326 4817 
scope.go:117] "RemoveContainer" containerID="21a162474f340754b12b22f618a7353fa5b951b83e285a68a10c6cb791a6a248" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.303088 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.306419 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.306453 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.315697 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.315821 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.331806 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.334792 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.336889 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.362676 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.379103 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.381049 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.385553 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.426679 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.502732 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-scripts\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.503876 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7hlm\" (UniqueName: \"kubernetes.io/projected/b542b984-8146-47e2-b20a-1b344762c302-kube-api-access-p7hlm\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.504219 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0" Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.504367 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tvbl\" (UniqueName: \"kubernetes.io/projected/89455d4a-c424-4e7a-85c5-42163318e132-kube-api-access-7tvbl\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0" Feb 18 14:19:54 crc 
kubenswrapper[4817]: I0218 14:19:54.504447 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.504529 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89455d4a-c424-4e7a-85c5-42163318e132-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.504667 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b542b984-8146-47e2-b20a-1b344762c302-logs\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.504761 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.504869 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.504960 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.505237 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.505355 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b542b984-8146-47e2-b20a-1b344762c302-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.505945 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-config-data\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.506094 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.554167 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sqhxh"]
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.558816 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.603377 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sqhxh"]
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608398 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-scripts\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608460 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7hlm\" (UniqueName: \"kubernetes.io/projected/b542b984-8146-47e2-b20a-1b344762c302-kube-api-access-p7hlm\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608493 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608558 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tvbl\" (UniqueName: \"kubernetes.io/projected/89455d4a-c424-4e7a-85c5-42163318e132-kube-api-access-7tvbl\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608582 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608606 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89455d4a-c424-4e7a-85c5-42163318e132-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608658 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b542b984-8146-47e2-b20a-1b344762c302-logs\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608683 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608721 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608800 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608830 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608868 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b542b984-8146-47e2-b20a-1b344762c302-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608888 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-config-data\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.608931 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.611134 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89455d4a-c424-4e7a-85c5-42163318e132-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.611512 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b542b984-8146-47e2-b20a-1b344762c302-logs\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.611861 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b542b984-8146-47e2-b20a-1b344762c302-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.619674 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.620048 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-scripts\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.620642 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.622005 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-config-data\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.622358 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.622852 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.627184 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b542b984-8146-47e2-b20a-1b344762c302-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.630843 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89455d4a-c424-4e7a-85c5-42163318e132-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.632260 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.632309 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/81a23472548d53035963276e43796643f625826f28e59ad0adc4b50496ee8da7/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.641435 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7hlm\" (UniqueName: \"kubernetes.io/projected/b542b984-8146-47e2-b20a-1b344762c302-kube-api-access-p7hlm\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.665276 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tvbl\" (UniqueName: \"kubernetes.io/projected/89455d4a-c424-4e7a-85c5-42163318e132-kube-api-access-7tvbl\") pod \"cinder-scheduler-0\" (UID: \"89455d4a-c424-4e7a-85c5-42163318e132\") " pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.711169 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-catalog-content\") pod \"certified-operators-sqhxh\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") " pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.711229 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-utilities\") pod \"certified-operators-sqhxh\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") " pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.711417 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdjh\" (UniqueName: \"kubernetes.io/projected/a938bc12-3666-41cf-b8e5-3fa647fe32f0-kube-api-access-2zdjh\") pod \"certified-operators-sqhxh\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") " pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.744357 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-be497b0f-a266-4a29-bed4-fc9e942da919\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be497b0f-a266-4a29-bed4-fc9e942da919\") pod \"glance-default-internal-api-0\" (UID: \"b542b984-8146-47e2-b20a-1b344762c302\") " pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.759868 4817 generic.go:334] "Generic (PLEG): container finished" podID="c6f7d4df-bc28-4a01-a044-091894ac27c2" containerID="f04251b56135cf24124d6f6b653e2c82951eecc3a89e526f44eabe6a9a119e2f" exitCode=0
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.763440 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-jgz44" event={"ID":"c6f7d4df-bc28-4a01-a044-091894ac27c2","Type":"ContainerDied","Data":"f04251b56135cf24124d6f6b653e2c82951eecc3a89e526f44eabe6a9a119e2f"}
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.770818 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.813245 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdjh\" (UniqueName: \"kubernetes.io/projected/a938bc12-3666-41cf-b8e5-3fa647fe32f0-kube-api-access-2zdjh\") pod \"certified-operators-sqhxh\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") " pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.813385 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-catalog-content\") pod \"certified-operators-sqhxh\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") " pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.813421 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-utilities\") pod \"certified-operators-sqhxh\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") " pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.814069 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-utilities\") pod \"certified-operators-sqhxh\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") " pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.814639 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-catalog-content\") pod \"certified-operators-sqhxh\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") " pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.840908 4817 generic.go:334] "Generic (PLEG): container finished" podID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerID="0a6317101227e2ad8e39fd2c97747591d2646ae8aefe7e421d9081caf929b39b" exitCode=0
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.840942 4817 generic.go:334] "Generic (PLEG): container finished" podID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerID="9b10b55573274a7ecaded4c0b33a6e94a7e8aa54fd8f5827328e8fb7c089eac6" exitCode=0
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.840950 4817 generic.go:334] "Generic (PLEG): container finished" podID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerID="3c57c14c6a1470b98b5ec85da130fbeefabf9505c6c010408d4e953e7f4a7ca1" exitCode=0
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.841045 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a","Type":"ContainerDied","Data":"0a6317101227e2ad8e39fd2c97747591d2646ae8aefe7e421d9081caf929b39b"}
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.841072 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a","Type":"ContainerDied","Data":"9b10b55573274a7ecaded4c0b33a6e94a7e8aa54fd8f5827328e8fb7c089eac6"}
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.841103 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a","Type":"ContainerDied","Data":"3c57c14c6a1470b98b5ec85da130fbeefabf9505c6c010408d4e953e7f4a7ca1"}
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.868088 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdjh\" (UniqueName: \"kubernetes.io/projected/a938bc12-3666-41cf-b8e5-3fa647fe32f0-kube-api-access-2zdjh\") pod \"certified-operators-sqhxh\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") " pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.901815 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rk57x"]
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.943756 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:19:54 crc kubenswrapper[4817]: I0218 14:19:54.975277 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.132848 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pjb2r"]
Feb 18 14:19:55 crc kubenswrapper[4817]: W0218 14:19:55.191640 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10d7cc80_38b6_46ed_8ec0_4c8ae03eb2b1.slice/crio-adfefa68fbab6423c3f6b55e77b63634176c163871dff6e9b545d353d2d85695 WatchSource:0}: Error finding container adfefa68fbab6423c3f6b55e77b63634176c163871dff6e9b545d353d2d85695: Status 404 returned error can't find the container with id adfefa68fbab6423c3f6b55e77b63634176c163871dff6e9b545d353d2d85695
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.618647 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-54xpq"]
Feb 18 14:19:55 crc kubenswrapper[4817]: W0218 14:19:55.638223 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43f9dae0_f2ed_4f91_b922_6f3432c8997d.slice/crio-48689e4127dc5d8e237fcea680ce46a71ed71fdda702d4ecc48a558221cf0ce9 WatchSource:0}: Error finding container 48689e4127dc5d8e237fcea680ce46a71ed71fdda702d4ecc48a558221cf0ce9: Status 404 returned error can't find the container with id 48689e4127dc5d8e237fcea680ce46a71ed71fdda702d4ecc48a558221cf0ce9
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.688444 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-16c7-account-create-update-qgzsl"]
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.705313 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.749258 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2spmb\" (UniqueName: \"kubernetes.io/projected/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-kube-api-access-2spmb\") pod \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") "
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.749303 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-scripts\") pod \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") "
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.749329 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-sg-core-conf-yaml\") pod \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") "
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.749384 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-combined-ca-bundle\") pod \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") "
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.749483 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-run-httpd\") pod \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") "
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.749583 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-config-data\") pod \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") "
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.749631 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-log-httpd\") pod \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\" (UID: \"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a\") "
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.750714 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" (UID: "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.757122 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" (UID: "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.759897 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-kube-api-access-2spmb" (OuterVolumeSpecName: "kube-api-access-2spmb") pod "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" (UID: "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a"). InnerVolumeSpecName "kube-api-access-2spmb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.773371 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e448-account-create-update-fzq87"]
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.795108 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-scripts" (OuterVolumeSpecName: "scripts") pod "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" (UID: "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.856505 4817 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.856526 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2spmb\" (UniqueName: \"kubernetes.io/projected/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-kube-api-access-2spmb\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.856539 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.856548 4817 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.906243 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" (UID: "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.938390 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rk57x" event={"ID":"4fe876a5-4499-40ff-b468-d395efa01d26","Type":"ContainerStarted","Data":"7536b26f3ab7582e5f8c97ef070a635653e8f1b7325cb897f3d1d94e52ad96ac"}
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.938495 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rk57x" event={"ID":"4fe876a5-4499-40ff-b468-d395efa01d26","Type":"ContainerStarted","Data":"0ca1516243e78c35d92eb4d64536976d5f905d1920cf8d2bf0fcbfdb0719dfe5"}
Feb 18 14:19:55 crc kubenswrapper[4817]: I0218 14:19:55.981622 4817 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:55.997797 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8d07244-c4ab-4bd5-8963-9e2213cb3e9a","Type":"ContainerDied","Data":"bb4547964222a00b27ee283cc159d46ca241897827d6e086ca055d9f5273b15f"}
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:55.997842 4817 scope.go:117] "RemoveContainer" containerID="0a6317101227e2ad8e39fd2c97747591d2646ae8aefe7e421d9081caf929b39b"
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:55.997965 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.008484 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" event={"ID":"43f9dae0-f2ed-4f91-b922-6f3432c8997d","Type":"ContainerStarted","Data":"48689e4127dc5d8e237fcea680ce46a71ed71fdda702d4ecc48a558221cf0ce9"}
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.013480 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9b47-account-create-update-cxfbq"]
Feb 18 14:19:56 crc kubenswrapper[4817]: W0218 14:19:56.019954 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89455d4a_c424_4e7a_85c5_42163318e132.slice/crio-981ba6cc597de180f6af7f99373e10e2b4c033d3ae34a6ae2040733c396711f2 WatchSource:0}: Error finding container 981ba6cc597de180f6af7f99373e10e2b4c033d3ae34a6ae2040733c396711f2: Status 404 returned error can't find the container with id 981ba6cc597de180f6af7f99373e10e2b4c033d3ae34a6ae2040733c396711f2
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.021786 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pjb2r" event={"ID":"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1","Type":"ContainerStarted","Data":"a8000b31370d1a61d32be27c11e56e9a063f1030ec9c7d7906a0b4c19aa8246a"}
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.021823 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pjb2r" event={"ID":"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1","Type":"ContainerStarted","Data":"adfefa68fbab6423c3f6b55e77b63634176c163871dff6e9b545d353d2d85695"}
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.022892 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-rk57x" podStartSLOduration=3.022881391 podStartE2EDuration="3.022881391s" podCreationTimestamp="2026-02-18 14:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:55.993205595 +0000 UTC m=+1258.568741578" watchObservedRunningTime="2026-02-18 14:19:56.022881391 +0000 UTC m=+1258.598417364"
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.060261 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e448-account-create-update-fzq87" event={"ID":"585ec0f8-a374-44ae-8b97-024af4983f69","Type":"ContainerStarted","Data":"dde004a2e88eef26f861ef181840ce352289df68e44f3d0b079bcf5b728d8b28"}
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.072840 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.081428 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-pjb2r" podStartSLOduration=3.081403702 podStartE2EDuration="3.081403702s" podCreationTimestamp="2026-02-18 14:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:56.047353486 +0000 UTC m=+1258.622889469" watchObservedRunningTime="2026-02-18 14:19:56.081403702 +0000 UTC m=+1258.656939695"
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.113292 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-54xpq" event={"ID":"42f4b322-ace9-42b0-944b-c5fa3181fc54","Type":"ContainerStarted","Data":"85f9cb0a1926cc0c5c3d15d4e8da22aa02bb937d58b4999780c080f93a5cd480"}
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.159621 4817 scope.go:117] "RemoveContainer" containerID="41f2843f88e5027daafa351b27095b39840388b314da0591e06ecb86d9b2bd49"
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.205295 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90dec55d-864d-49da-b960-0a8e51e4d0ad" path="/var/lib/kubelet/pods/90dec55d-864d-49da-b960-0a8e51e4d0ad/volumes"
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.205457 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" (UID: "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.208922 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eae76135-65ce-405b-8f64-55bb66e7de22" path="/var/lib/kubelet/pods/eae76135-65ce-405b-8f64-55bb66e7de22/volumes"
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.294467 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.368180 4817 scope.go:117] "RemoveContainer" containerID="9b10b55573274a7ecaded4c0b33a6e94a7e8aa54fd8f5827328e8fb7c089eac6"
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.437493 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sqhxh"]
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.456320 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-config-data" (OuterVolumeSpecName: "config-data") pod "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" (UID: "f8d07244-c4ab-4bd5-8963-9e2213cb3e9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.458168 4817 scope.go:117] "RemoveContainer" containerID="3c57c14c6a1470b98b5ec85da130fbeefabf9505c6c010408d4e953e7f4a7ca1"
Feb 18 14:19:56 crc kubenswrapper[4817]: W0218 14:19:56.495445 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda938bc12_3666_41cf_b8e5_3fa647fe32f0.slice/crio-ebf4308eacf7d19fc07c508db18b16275883c3406256abe69cbb109dc19a29b7 WatchSource:0}: Error finding container ebf4308eacf7d19fc07c508db18b16275883c3406256abe69cbb109dc19a29b7: Status 404 returned error can't find the container with id ebf4308eacf7d19fc07c508db18b16275883c3406256abe69cbb109dc19a29b7
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.501218 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.562042 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.730487 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.757166 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.798269 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:19:56 crc kubenswrapper[4817]: E0218 14:19:56.802041 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="sg-core"
Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.802070 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a"
containerName="sg-core" Feb 18 14:19:56 crc kubenswrapper[4817]: E0218 14:19:56.802091 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="ceilometer-central-agent" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.802099 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="ceilometer-central-agent" Feb 18 14:19:56 crc kubenswrapper[4817]: E0218 14:19:56.802116 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="proxy-httpd" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.802125 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="proxy-httpd" Feb 18 14:19:56 crc kubenswrapper[4817]: E0218 14:19:56.802144 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="ceilometer-notification-agent" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.802151 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="ceilometer-notification-agent" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.802494 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="ceilometer-notification-agent" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.802520 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="ceilometer-central-agent" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.802537 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="sg-core" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.802551 4817 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" containerName="proxy-httpd" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.805808 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.815166 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.817183 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 14:19:56 crc kubenswrapper[4817]: I0218 14:19:56.915052 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.021035 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.021223 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msdnz\" (UniqueName: \"kubernetes.io/projected/f4194006-61f7-4c39-91cf-bc5a9003a2d1-kube-api-access-msdnz\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.021323 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-config-data\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.021347 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-run-httpd\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.021431 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.021509 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-scripts\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.021547 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-log-httpd\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.105079 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.129840 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msdnz\" (UniqueName: \"kubernetes.io/projected/f4194006-61f7-4c39-91cf-bc5a9003a2d1-kube-api-access-msdnz\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.129932 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-config-data\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.129952 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-run-httpd\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.130050 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.130092 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-scripts\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.130116 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-log-httpd\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.130164 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.131717 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-run-httpd\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.133159 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-log-httpd\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.149030 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-config-data\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.151523 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-scripts\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.154575 4817 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.157298 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" event={"ID":"c5d3725d-5bb0-4edd-b707-6690d2ac99f5","Type":"ContainerStarted","Data":"285489d62434a2d1524abfb4ca7649979b15a2848e6d6d60c03d663094c8870b"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.157396 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" event={"ID":"c5d3725d-5bb0-4edd-b707-6690d2ac99f5","Type":"ContainerStarted","Data":"9dc2bcd640a30ec8537c76920ad6547038b920518352a865bd66f20f18757ab6"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.157412 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e448-account-create-update-fzq87" event={"ID":"585ec0f8-a374-44ae-8b97-024af4983f69","Type":"ContainerStarted","Data":"a57f60ed37120dae989c8e17c03a3028d7155108550b32a921b8a3e1b9462836"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.159282 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msdnz\" (UniqueName: \"kubernetes.io/projected/f4194006-61f7-4c39-91cf-bc5a9003a2d1-kube-api-access-msdnz\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.162098 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-54xpq" event={"ID":"42f4b322-ace9-42b0-944b-c5fa3181fc54","Type":"ContainerStarted","Data":"d103343187fc0b61663b10abdb9c7a21d00e8d59353a5b51024fe08bcfe57e4d"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.168091 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") " pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.172902 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqhxh" event={"ID":"a938bc12-3666-41cf-b8e5-3fa647fe32f0","Type":"ContainerStarted","Data":"ebf4308eacf7d19fc07c508db18b16275883c3406256abe69cbb109dc19a29b7"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.206849 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"89455d4a-c424-4e7a-85c5-42163318e132","Type":"ContainerStarted","Data":"981ba6cc597de180f6af7f99373e10e2b4c033d3ae34a6ae2040733c396711f2"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.218402 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" event={"ID":"43f9dae0-f2ed-4f91-b922-6f3432c8997d","Type":"ContainerStarted","Data":"b7c54389cd72035e803566681b0052acb780bca86033ace971da40787be37322"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.223995 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-jgz44" event={"ID":"c6f7d4df-bc28-4a01-a044-091894ac27c2","Type":"ContainerDied","Data":"9ec7de488cfce9abf572a11225c170ce0aaf396fa32791239bcc1cff60fbce32"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.224043 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ec7de488cfce9abf572a11225c170ce0aaf396fa32791239bcc1cff60fbce32" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.224169 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-jgz44" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.238135 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-certs\") pod \"c6f7d4df-bc28-4a01-a044-091894ac27c2\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.242335 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-combined-ca-bundle\") pod \"c6f7d4df-bc28-4a01-a044-091894ac27c2\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.243052 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b542b984-8146-47e2-b20a-1b344762c302","Type":"ContainerStarted","Data":"f909af278d183856eb38c8c9e35372a47bbb772b70cd21650ae0854bf2bd4f2a"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.243259 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-scripts\") pod \"c6f7d4df-bc28-4a01-a044-091894ac27c2\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.243282 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gwv8\" (UniqueName: \"kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-kube-api-access-5gwv8\") pod \"c6f7d4df-bc28-4a01-a044-091894ac27c2\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.243362 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-config-data\") pod \"c6f7d4df-bc28-4a01-a044-091894ac27c2\" (UID: \"c6f7d4df-bc28-4a01-a044-091894ac27c2\") " Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.253772 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-kube-api-access-5gwv8" (OuterVolumeSpecName: "kube-api-access-5gwv8") pod "c6f7d4df-bc28-4a01-a044-091894ac27c2" (UID: "c6f7d4df-bc28-4a01-a044-091894ac27c2"). InnerVolumeSpecName "kube-api-access-5gwv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.255134 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-scripts" (OuterVolumeSpecName: "scripts") pod "c6f7d4df-bc28-4a01-a044-091894ac27c2" (UID: "c6f7d4df-bc28-4a01-a044-091894ac27c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.255867 4817 generic.go:334] "Generic (PLEG): container finished" podID="4fe876a5-4499-40ff-b468-d395efa01d26" containerID="7536b26f3ab7582e5f8c97ef070a635653e8f1b7325cb897f3d1d94e52ad96ac" exitCode=0 Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.255917 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rk57x" event={"ID":"4fe876a5-4499-40ff-b468-d395efa01d26","Type":"ContainerDied","Data":"7536b26f3ab7582e5f8c97ef070a635653e8f1b7325cb897f3d1d94e52ad96ac"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.257770 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-certs" (OuterVolumeSpecName: "certs") pod "c6f7d4df-bc28-4a01-a044-091894ac27c2" (UID: "c6f7d4df-bc28-4a01-a044-091894ac27c2"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.259449 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" podStartSLOduration=4.259433125 podStartE2EDuration="4.259433125s" podCreationTimestamp="2026-02-18 14:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:57.196238196 +0000 UTC m=+1259.771774179" watchObservedRunningTime="2026-02-18 14:19:57.259433125 +0000 UTC m=+1259.834969108" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.262007 4817 generic.go:334] "Generic (PLEG): container finished" podID="10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1" containerID="a8000b31370d1a61d32be27c11e56e9a063f1030ec9c7d7906a0b4c19aa8246a" exitCode=0 Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.262043 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pjb2r" event={"ID":"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1","Type":"ContainerDied","Data":"a8000b31370d1a61d32be27c11e56e9a063f1030ec9c7d7906a0b4c19aa8246a"} Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.280235 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-54xpq" podStartSLOduration=4.280212267 podStartE2EDuration="4.280212267s" podCreationTimestamp="2026-02-18 14:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:57.225705527 +0000 UTC m=+1259.801241520" watchObservedRunningTime="2026-02-18 14:19:57.280212267 +0000 UTC m=+1259.855748250" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.287641 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-e448-account-create-update-fzq87" 
podStartSLOduration=4.287619443 podStartE2EDuration="4.287619443s" podCreationTimestamp="2026-02-18 14:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:57.243084244 +0000 UTC m=+1259.818620227" watchObservedRunningTime="2026-02-18 14:19:57.287619443 +0000 UTC m=+1259.863155426" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.304527 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-config-data" (OuterVolumeSpecName: "config-data") pod "c6f7d4df-bc28-4a01-a044-091894ac27c2" (UID: "c6f7d4df-bc28-4a01-a044-091894ac27c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.323061 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" podStartSLOduration=4.323030213 podStartE2EDuration="4.323030213s" podCreationTimestamp="2026-02-18 14:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:19:57.266918013 +0000 UTC m=+1259.842453996" watchObservedRunningTime="2026-02-18 14:19:57.323030213 +0000 UTC m=+1259.898566196" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.329420 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6f7d4df-bc28-4a01-a044-091894ac27c2" (UID: "c6f7d4df-bc28-4a01-a044-091894ac27c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.350353 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.350394 4817 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.350406 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.350418 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6f7d4df-bc28-4a01-a044-091894ac27c2-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.350429 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gwv8\" (UniqueName: \"kubernetes.io/projected/c6f7d4df-bc28-4a01-a044-091894ac27c2-kube-api-access-5gwv8\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.396373 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:19:57 crc kubenswrapper[4817]: W0218 14:19:57.982136 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4194006_61f7_4c39_91cf_bc5a9003a2d1.slice/crio-bf1c2612dfa02f87daf40bcb3f230742c2444888338fb4618acf64ca0a37966f WatchSource:0}: Error finding container bf1c2612dfa02f87daf40bcb3f230742c2444888338fb4618acf64ca0a37966f: Status 404 returned error can't find the container with id bf1c2612dfa02f87daf40bcb3f230742c2444888338fb4618acf64ca0a37966f Feb 18 14:19:57 crc kubenswrapper[4817]: I0218 14:19:57.985866 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.068080 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.244208 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d07244-c4ab-4bd5-8963-9e2213cb3e9a" path="/var/lib/kubelet/pods/f8d07244-c4ab-4bd5-8963-9e2213cb3e9a/volumes" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.398528 4817 generic.go:334] "Generic (PLEG): container finished" podID="42f4b322-ace9-42b0-944b-c5fa3181fc54" containerID="d103343187fc0b61663b10abdb9c7a21d00e8d59353a5b51024fe08bcfe57e4d" exitCode=0 Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.398873 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-54xpq" event={"ID":"42f4b322-ace9-42b0-944b-c5fa3181fc54","Type":"ContainerDied","Data":"d103343187fc0b61663b10abdb9c7a21d00e8d59353a5b51024fe08bcfe57e4d"} Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.458378 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b542b984-8146-47e2-b20a-1b344762c302","Type":"ContainerStarted","Data":"a264bca3cef19037972d7da6cda09f75428dd7e2dc782f44301818c51f22ae21"} Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.464708 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4194006-61f7-4c39-91cf-bc5a9003a2d1","Type":"ContainerStarted","Data":"bf1c2612dfa02f87daf40bcb3f230742c2444888338fb4618acf64ca0a37966f"} Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.470107 4817 generic.go:334] "Generic (PLEG): container finished" podID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerID="67d9a30dc1dd9c7ee86d40c3ccc79eac7095411317c649564b8d192a189712ba" exitCode=0 Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.470162 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqhxh" event={"ID":"a938bc12-3666-41cf-b8e5-3fa647fe32f0","Type":"ContainerDied","Data":"67d9a30dc1dd9c7ee86d40c3ccc79eac7095411317c649564b8d192a189712ba"} Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.478242 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"89455d4a-c424-4e7a-85c5-42163318e132","Type":"ContainerStarted","Data":"6365fdaa9f2082f8a0c3d3d05e365e3e6bb5b59c6d8eea8ede112046ab750419"} Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.483477 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 18 14:19:58 crc kubenswrapper[4817]: E0218 14:19:58.483913 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f7d4df-bc28-4a01-a044-091894ac27c2" containerName="cloudkitty-storageinit" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.483925 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f7d4df-bc28-4a01-a044-091894ac27c2" containerName="cloudkitty-storageinit" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.484180 4817 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c6f7d4df-bc28-4a01-a044-091894ac27c2" containerName="cloudkitty-storageinit" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.484862 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.492516 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.493568 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.493749 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.493877 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-zgqz6" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.494058 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.494167 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.655485 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh79q\" (UniqueName: \"kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-kube-api-access-lh79q\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.655574 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-certs\") pod \"cloudkitty-proc-0\" (UID: 
\"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.655648 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.655822 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.655850 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.655865 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-scripts\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.754407 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76d4d7c9b7-hsgjd"] Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.765604 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.765887 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh79q\" (UniqueName: \"kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-kube-api-access-lh79q\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.766769 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-certs\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.766870 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.767163 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.767202 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.767219 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-scripts\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.783900 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.793749 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-scripts\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.794672 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.820623 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh79q\" (UniqueName: \"kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-kube-api-access-lh79q\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.821116 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76d4d7c9b7-hsgjd"] Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.840132 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.849561 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-certs\") pod \"cloudkitty-proc-0\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.869237 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-sb\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.869338 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-nb\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.869376 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-swift-storage-0\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.869424 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6vnl\" 
(UniqueName: \"kubernetes.io/projected/fe95924e-a4f6-4844-b496-461e91941a16-kube-api-access-q6vnl\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.869459 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-svc\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.869478 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-config\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.907226 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.974944 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6vnl\" (UniqueName: \"kubernetes.io/projected/fe95924e-a4f6-4844-b496-461e91941a16-kube-api-access-q6vnl\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.975039 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-svc\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.975063 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-config\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.975131 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-sb\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.975258 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-nb\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 
14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.975306 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-swift-storage-0\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.976725 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-swift-storage-0\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:58 crc kubenswrapper[4817]: I0218 14:19:58.976746 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-config\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:58.994644 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-sb\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:58.994754 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:58.995626 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-svc\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " 
pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:58.996052 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-nb\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:58.997291 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.004537 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.031123 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6vnl\" (UniqueName: \"kubernetes.io/projected/fe95924e-a4f6-4844-b496-461e91941a16-kube-api-access-q6vnl\") pod \"dnsmasq-dns-76d4d7c9b7-hsgjd\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.073413 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.078511 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.078756 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data\") pod \"cloudkitty-api-0\" 
(UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.078910 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586418c8-0373-4f82-beba-46a811db26a7-logs\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.079164 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-certs\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.080041 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.080172 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8npd\" (UniqueName: \"kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-kube-api-access-v8npd\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.080276 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-scripts\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc 
kubenswrapper[4817]: I0218 14:19:59.090843 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.091602 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.186398 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-certs\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.186771 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.186852 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8npd\" (UniqueName: \"kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-kube-api-access-v8npd\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.186915 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-scripts\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.187179 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.187329 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.187358 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586418c8-0373-4f82-beba-46a811db26a7-logs\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.187963 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586418c8-0373-4f82-beba-46a811db26a7-logs\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.227592 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-scripts\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.228054 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.228878 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.247879 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-certs\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.250542 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8npd\" (UniqueName: \"kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-kube-api-access-v8npd\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.266378 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.266847 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data\") pod \"cloudkitty-api-0\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.305909 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.356806 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.450056 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.510430 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pjb2r" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.523608 4817 generic.go:334] "Generic (PLEG): container finished" podID="c5d3725d-5bb0-4edd-b707-6690d2ac99f5" containerID="285489d62434a2d1524abfb4ca7649979b15a2848e6d6d60c03d663094c8870b" exitCode=0 Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.523690 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" event={"ID":"c5d3725d-5bb0-4edd-b707-6690d2ac99f5","Type":"ContainerDied","Data":"285489d62434a2d1524abfb4ca7649979b15a2848e6d6d60c03d663094c8870b"} Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.536784 4817 generic.go:334] "Generic (PLEG): container finished" podID="43f9dae0-f2ed-4f91-b922-6f3432c8997d" containerID="b7c54389cd72035e803566681b0052acb780bca86033ace971da40787be37322" exitCode=0 Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.536870 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" event={"ID":"43f9dae0-f2ed-4f91-b922-6f3432c8997d","Type":"ContainerDied","Data":"b7c54389cd72035e803566681b0052acb780bca86033ace971da40787be37322"} Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.560794 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pjb2r" event={"ID":"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1","Type":"ContainerDied","Data":"adfefa68fbab6423c3f6b55e77b63634176c163871dff6e9b545d353d2d85695"} Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.560860 4817 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adfefa68fbab6423c3f6b55e77b63634176c163871dff6e9b545d353d2d85695" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.561359 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pjb2r" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.562299 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.562322 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.605432 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv2hb\" (UniqueName: \"kubernetes.io/projected/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-kube-api-access-sv2hb\") pod \"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1\" (UID: \"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1\") " Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.605608 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-operator-scripts\") pod \"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1\" (UID: \"10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1\") " Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.618541 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1" (UID: "10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.625571 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-kube-api-access-sv2hb" (OuterVolumeSpecName: "kube-api-access-sv2hb") pod "10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1" (UID: "10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1"). InnerVolumeSpecName "kube-api-access-sv2hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:59 crc kubenswrapper[4817]: E0218 14:19:59.687868 4817 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43f9dae0_f2ed_4f91_b922_6f3432c8997d.slice/crio-conmon-b7c54389cd72035e803566681b0052acb780bca86033ace971da40787be37322.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod585ec0f8_a374_44ae_8b97_024af4983f69.slice/crio-a57f60ed37120dae989c8e17c03a3028d7155108550b32a921b8a3e1b9462836.scope\": RecentStats: unable to find data in memory cache]" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.711309 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.711359 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv2hb\" (UniqueName: \"kubernetes.io/projected/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1-kube-api-access-sv2hb\") on node \"crc\" DevicePath \"\"" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.764645 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-f8cb89c64-cqrwn" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-httpd" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.765069 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-f8cb89c64-cqrwn" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.765118 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-f8cb89c64-cqrwn" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.883644 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rk57x" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.917099 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe876a5-4499-40ff-b468-d395efa01d26-operator-scripts\") pod \"4fe876a5-4499-40ff-b468-d395efa01d26\" (UID: \"4fe876a5-4499-40ff-b468-d395efa01d26\") " Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.917149 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5ds5\" (UniqueName: \"kubernetes.io/projected/4fe876a5-4499-40ff-b468-d395efa01d26-kube-api-access-t5ds5\") pod \"4fe876a5-4499-40ff-b468-d395efa01d26\" (UID: \"4fe876a5-4499-40ff-b468-d395efa01d26\") " Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.917863 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe876a5-4499-40ff-b468-d395efa01d26-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fe876a5-4499-40ff-b468-d395efa01d26" (UID: "4fe876a5-4499-40ff-b468-d395efa01d26"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.942282 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe876a5-4499-40ff-b468-d395efa01d26-kube-api-access-t5ds5" (OuterVolumeSpecName: "kube-api-access-t5ds5") pod "4fe876a5-4499-40ff-b468-d395efa01d26" (UID: "4fe876a5-4499-40ff-b468-d395efa01d26"). InnerVolumeSpecName "kube-api-access-t5ds5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:19:59 crc kubenswrapper[4817]: I0218 14:19:59.986280 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="a414e293-71b9-44c3-8f07-20f3696f7db6" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.188:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.026314 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5ds5\" (UniqueName: \"kubernetes.io/projected/4fe876a5-4499-40ff-b468-d395efa01d26-kube-api-access-t5ds5\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.028372 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe876a5-4499-40ff-b468-d395efa01d26-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.391761 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.558877 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-54xpq" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.592457 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"89455d4a-c424-4e7a-85c5-42163318e132","Type":"ContainerStarted","Data":"0323959ac0584fef53b9a621878f4729f2be693824d6207cc3661dac5da211c1"} Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.601937 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rk57x" event={"ID":"4fe876a5-4499-40ff-b468-d395efa01d26","Type":"ContainerDied","Data":"0ca1516243e78c35d92eb4d64536976d5f905d1920cf8d2bf0fcbfdb0719dfe5"} Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.601991 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca1516243e78c35d92eb4d64536976d5f905d1920cf8d2bf0fcbfdb0719dfe5" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.602058 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rk57x" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.608323 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"fd188298-86b3-470b-b3ab-d3d3c9e356a7","Type":"ContainerStarted","Data":"72a75732403058e1abbb8578e2ef5a39a668f1fef6d27f61200418f8673cc17f"} Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.613148 4817 generic.go:334] "Generic (PLEG): container finished" podID="585ec0f8-a374-44ae-8b97-024af4983f69" containerID="a57f60ed37120dae989c8e17c03a3028d7155108550b32a921b8a3e1b9462836" exitCode=0 Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.613283 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e448-account-create-update-fzq87" event={"ID":"585ec0f8-a374-44ae-8b97-024af4983f69","Type":"ContainerDied","Data":"a57f60ed37120dae989c8e17c03a3028d7155108550b32a921b8a3e1b9462836"} Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.616871 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-54xpq" event={"ID":"42f4b322-ace9-42b0-944b-c5fa3181fc54","Type":"ContainerDied","Data":"85f9cb0a1926cc0c5c3d15d4e8da22aa02bb937d58b4999780c080f93a5cd480"} Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.616909 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f9cb0a1926cc0c5c3d15d4e8da22aa02bb937d58b4999780c080f93a5cd480" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.616964 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-54xpq" Feb 18 14:20:00 crc kubenswrapper[4817]: W0218 14:20:00.636248 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod586418c8_0373_4f82_beba_46a811db26a7.slice/crio-e0e99e324a4fb34cd20cb7a64884e6c634b8f2c04fcddef3b49fc3812c509714 WatchSource:0}: Error finding container e0e99e324a4fb34cd20cb7a64884e6c634b8f2c04fcddef3b49fc3812c509714: Status 404 returned error can't find the container with id e0e99e324a4fb34cd20cb7a64884e6c634b8f2c04fcddef3b49fc3812c509714 Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.640582 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.645054 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b542b984-8146-47e2-b20a-1b344762c302","Type":"ContainerStarted","Data":"8047164e1b3b51bb88f17a98b30a9d1dc11b6c0e6a450f444f5ccf576c49d7ac"} Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.646589 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.646570148 podStartE2EDuration="6.646570148s" podCreationTimestamp="2026-02-18 14:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:20:00.618078552 +0000 UTC m=+1263.193614535" watchObservedRunningTime="2026-02-18 14:20:00.646570148 +0000 UTC m=+1263.222106131" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.651408 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42f4b322-ace9-42b0-944b-c5fa3181fc54-operator-scripts\") pod \"42f4b322-ace9-42b0-944b-c5fa3181fc54\" (UID: \"42f4b322-ace9-42b0-944b-c5fa3181fc54\") " Feb 
18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.651502 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc4d4\" (UniqueName: \"kubernetes.io/projected/42f4b322-ace9-42b0-944b-c5fa3181fc54-kube-api-access-vc4d4\") pod \"42f4b322-ace9-42b0-944b-c5fa3181fc54\" (UID: \"42f4b322-ace9-42b0-944b-c5fa3181fc54\") " Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.653881 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42f4b322-ace9-42b0-944b-c5fa3181fc54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42f4b322-ace9-42b0-944b-c5fa3181fc54" (UID: "42f4b322-ace9-42b0-944b-c5fa3181fc54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.659150 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76d4d7c9b7-hsgjd"] Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.660719 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42f4b322-ace9-42b0-944b-c5fa3181fc54-kube-api-access-vc4d4" (OuterVolumeSpecName: "kube-api-access-vc4d4") pod "42f4b322-ace9-42b0-944b-c5fa3181fc54" (UID: "42f4b322-ace9-42b0-944b-c5fa3181fc54"). InnerVolumeSpecName "kube-api-access-vc4d4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.673541 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4194006-61f7-4c39-91cf-bc5a9003a2d1","Type":"ContainerStarted","Data":"4cdaa3cd6fc55060608e76a9a87f57a81c7da3be01e10038a75e033a53455414"} Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.676376 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.676354777 podStartE2EDuration="6.676354777s" podCreationTimestamp="2026-02-18 14:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:20:00.675129616 +0000 UTC m=+1263.250665599" watchObservedRunningTime="2026-02-18 14:20:00.676354777 +0000 UTC m=+1263.251890760" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.754373 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42f4b322-ace9-42b0-944b-c5fa3181fc54-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:00 crc kubenswrapper[4817]: I0218 14:20:00.760256 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc4d4\" (UniqueName: \"kubernetes.io/projected/42f4b322-ace9-42b0-944b-c5fa3181fc54-kube-api-access-vc4d4\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:01 crc kubenswrapper[4817]: I0218 14:20:01.636763 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:20:01 crc kubenswrapper[4817]: I0218 14:20:01.705730 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"586418c8-0373-4f82-beba-46a811db26a7","Type":"ContainerStarted","Data":"e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5"} Feb 18 14:20:01 crc kubenswrapper[4817]: I0218 14:20:01.705781 4817 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"586418c8-0373-4f82-beba-46a811db26a7","Type":"ContainerStarted","Data":"e0e99e324a4fb34cd20cb7a64884e6c634b8f2c04fcddef3b49fc3812c509714"} Feb 18 14:20:01 crc kubenswrapper[4817]: I0218 14:20:01.709016 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" event={"ID":"fe95924e-a4f6-4844-b496-461e91941a16","Type":"ContainerStarted","Data":"7166b0c6410788e979b3ae08d0f032c6efe6b0fbb4ace60a557184fdaf0b89ef"} Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.109680 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.138590 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.222764 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpzq9\" (UniqueName: \"kubernetes.io/projected/43f9dae0-f2ed-4f91-b922-6f3432c8997d-kube-api-access-tpzq9\") pod \"43f9dae0-f2ed-4f91-b922-6f3432c8997d\" (UID: \"43f9dae0-f2ed-4f91-b922-6f3432c8997d\") " Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.223261 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lvpj\" (UniqueName: \"kubernetes.io/projected/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-kube-api-access-5lvpj\") pod \"c5d3725d-5bb0-4edd-b707-6690d2ac99f5\" (UID: \"c5d3725d-5bb0-4edd-b707-6690d2ac99f5\") " Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.223409 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-operator-scripts\") pod 
\"c5d3725d-5bb0-4edd-b707-6690d2ac99f5\" (UID: \"c5d3725d-5bb0-4edd-b707-6690d2ac99f5\") " Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.223439 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43f9dae0-f2ed-4f91-b922-6f3432c8997d-operator-scripts\") pod \"43f9dae0-f2ed-4f91-b922-6f3432c8997d\" (UID: \"43f9dae0-f2ed-4f91-b922-6f3432c8997d\") " Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.233503 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43f9dae0-f2ed-4f91-b922-6f3432c8997d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43f9dae0-f2ed-4f91-b922-6f3432c8997d" (UID: "43f9dae0-f2ed-4f91-b922-6f3432c8997d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.233652 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5d3725d-5bb0-4edd-b707-6690d2ac99f5" (UID: "c5d3725d-5bb0-4edd-b707-6690d2ac99f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.235092 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-kube-api-access-5lvpj" (OuterVolumeSpecName: "kube-api-access-5lvpj") pod "c5d3725d-5bb0-4edd-b707-6690d2ac99f5" (UID: "c5d3725d-5bb0-4edd-b707-6690d2ac99f5"). InnerVolumeSpecName "kube-api-access-5lvpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.235481 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f9dae0-f2ed-4f91-b922-6f3432c8997d-kube-api-access-tpzq9" (OuterVolumeSpecName: "kube-api-access-tpzq9") pod "43f9dae0-f2ed-4f91-b922-6f3432c8997d" (UID: "43f9dae0-f2ed-4f91-b922-6f3432c8997d"). InnerVolumeSpecName "kube-api-access-tpzq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.332564 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lvpj\" (UniqueName: \"kubernetes.io/projected/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-kube-api-access-5lvpj\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.332592 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d3725d-5bb0-4edd-b707-6690d2ac99f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.332600 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43f9dae0-f2ed-4f91-b922-6f3432c8997d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.332626 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpzq9\" (UniqueName: \"kubernetes.io/projected/43f9dae0-f2ed-4f91-b922-6f3432c8997d-kube-api-access-tpzq9\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.662612 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.737956 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" 
event={"ID":"43f9dae0-f2ed-4f91-b922-6f3432c8997d","Type":"ContainerDied","Data":"48689e4127dc5d8e237fcea680ce46a71ed71fdda702d4ecc48a558221cf0ce9"} Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.738008 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48689e4127dc5d8e237fcea680ce46a71ed71fdda702d4ecc48a558221cf0ce9" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.738200 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-16c7-account-create-update-qgzsl" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.762804 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqhxh" event={"ID":"a938bc12-3666-41cf-b8e5-3fa647fe32f0","Type":"ContainerStarted","Data":"57a8e3c4eccbeb4ae0898dbd9c80d4b99913889c202df5693c9c7dc254444693"} Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.778128 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.779066 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9b47-account-create-update-cxfbq" event={"ID":"c5d3725d-5bb0-4edd-b707-6690d2ac99f5","Type":"ContainerDied","Data":"9dc2bcd640a30ec8537c76920ad6547038b920518352a865bd66f20f18757ab6"} Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.779113 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dc2bcd640a30ec8537c76920ad6547038b920518352a865bd66f20f18757ab6" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.828424 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e448-account-create-update-fzq87" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.953835 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nhxs\" (UniqueName: \"kubernetes.io/projected/585ec0f8-a374-44ae-8b97-024af4983f69-kube-api-access-2nhxs\") pod \"585ec0f8-a374-44ae-8b97-024af4983f69\" (UID: \"585ec0f8-a374-44ae-8b97-024af4983f69\") " Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.954078 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/585ec0f8-a374-44ae-8b97-024af4983f69-operator-scripts\") pod \"585ec0f8-a374-44ae-8b97-024af4983f69\" (UID: \"585ec0f8-a374-44ae-8b97-024af4983f69\") " Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.955450 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/585ec0f8-a374-44ae-8b97-024af4983f69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "585ec0f8-a374-44ae-8b97-024af4983f69" (UID: "585ec0f8-a374-44ae-8b97-024af4983f69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:20:02 crc kubenswrapper[4817]: I0218 14:20:02.967489 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585ec0f8-a374-44ae-8b97-024af4983f69-kube-api-access-2nhxs" (OuterVolumeSpecName: "kube-api-access-2nhxs") pod "585ec0f8-a374-44ae-8b97-024af4983f69" (UID: "585ec0f8-a374-44ae-8b97-024af4983f69"). InnerVolumeSpecName "kube-api-access-2nhxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.056787 4817 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/585ec0f8-a374-44ae-8b97-024af4983f69-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.056836 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nhxs\" (UniqueName: \"kubernetes.io/projected/585ec0f8-a374-44ae-8b97-024af4983f69-kube-api-access-2nhxs\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.490214 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="a414e293-71b9-44c3-8f07-20f3696f7db6" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.188:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.887268 4817 generic.go:334] "Generic (PLEG): container finished" podID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerID="57a8e3c4eccbeb4ae0898dbd9c80d4b99913889c202df5693c9c7dc254444693" exitCode=0 Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.887379 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqhxh" event={"ID":"a938bc12-3666-41cf-b8e5-3fa647fe32f0","Type":"ContainerDied","Data":"57a8e3c4eccbeb4ae0898dbd9c80d4b99913889c202df5693c9c7dc254444693"} Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.916023 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"586418c8-0373-4f82-beba-46a811db26a7","Type":"ContainerStarted","Data":"0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268"} Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.916247 4817 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/cloudkitty-api-0" podUID="586418c8-0373-4f82-beba-46a811db26a7" containerName="cloudkitty-api-log" containerID="cri-o://e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5" gracePeriod=30 Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.916330 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.916357 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="586418c8-0373-4f82-beba-46a811db26a7" containerName="cloudkitty-api" containerID="cri-o://0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268" gracePeriod=30 Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.968291 4817 generic.go:334] "Generic (PLEG): container finished" podID="fe95924e-a4f6-4844-b496-461e91941a16" containerID="941c80450ab2e862b065261379136eebc674624ce6ea44cb9888734acd0e551e" exitCode=0 Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.968402 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" event={"ID":"fe95924e-a4f6-4844-b496-461e91941a16","Type":"ContainerDied","Data":"941c80450ab2e862b065261379136eebc674624ce6ea44cb9888734acd0e551e"} Feb 18 14:20:03 crc kubenswrapper[4817]: I0218 14:20:03.989661 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4194006-61f7-4c39-91cf-bc5a9003a2d1","Type":"ContainerStarted","Data":"53bef78fafa0a35a8769da9ba7a098916b0691b1b9c9f335987c4c432fe29c8a"} Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.002275 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=6.002226799 podStartE2EDuration="6.002226799s" podCreationTimestamp="2026-02-18 14:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-18 14:20:03.953992187 +0000 UTC m=+1266.529528170" watchObservedRunningTime="2026-02-18 14:20:04.002226799 +0000 UTC m=+1266.577762782" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.007924 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e448-account-create-update-fzq87" event={"ID":"585ec0f8-a374-44ae-8b97-024af4983f69","Type":"ContainerDied","Data":"dde004a2e88eef26f861ef181840ce352289df68e44f3d0b079bcf5b728d8b28"} Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.007964 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dde004a2e88eef26f861ef181840ce352289df68e44f3d0b079bcf5b728d8b28" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.008150 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e448-account-create-update-fzq87" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.141499 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9gmj9"] Feb 18 14:20:04 crc kubenswrapper[4817]: E0218 14:20:04.142020 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe876a5-4499-40ff-b468-d395efa01d26" containerName="mariadb-database-create" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142044 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe876a5-4499-40ff-b468-d395efa01d26" containerName="mariadb-database-create" Feb 18 14:20:04 crc kubenswrapper[4817]: E0218 14:20:04.142077 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42f4b322-ace9-42b0-944b-c5fa3181fc54" containerName="mariadb-database-create" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142088 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f4b322-ace9-42b0-944b-c5fa3181fc54" containerName="mariadb-database-create" Feb 18 14:20:04 crc kubenswrapper[4817]: E0218 14:20:04.142118 4817 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c5d3725d-5bb0-4edd-b707-6690d2ac99f5" containerName="mariadb-account-create-update" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142127 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d3725d-5bb0-4edd-b707-6690d2ac99f5" containerName="mariadb-account-create-update" Feb 18 14:20:04 crc kubenswrapper[4817]: E0218 14:20:04.142142 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585ec0f8-a374-44ae-8b97-024af4983f69" containerName="mariadb-account-create-update" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142150 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="585ec0f8-a374-44ae-8b97-024af4983f69" containerName="mariadb-account-create-update" Feb 18 14:20:04 crc kubenswrapper[4817]: E0218 14:20:04.142167 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f9dae0-f2ed-4f91-b922-6f3432c8997d" containerName="mariadb-account-create-update" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142177 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f9dae0-f2ed-4f91-b922-6f3432c8997d" containerName="mariadb-account-create-update" Feb 18 14:20:04 crc kubenswrapper[4817]: E0218 14:20:04.142190 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1" containerName="mariadb-database-create" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142197 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1" containerName="mariadb-database-create" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142420 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d3725d-5bb0-4edd-b707-6690d2ac99f5" containerName="mariadb-account-create-update" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142437 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1" 
containerName="mariadb-database-create" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142462 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="42f4b322-ace9-42b0-944b-c5fa3181fc54" containerName="mariadb-database-create" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142477 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe876a5-4499-40ff-b468-d395efa01d26" containerName="mariadb-database-create" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142489 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f9dae0-f2ed-4f91-b922-6f3432c8997d" containerName="mariadb-account-create-update" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.142502 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="585ec0f8-a374-44ae-8b97-024af4983f69" containerName="mariadb-account-create-update" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.143405 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.154387 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9gmj9"] Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.168593 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.168633 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.168989 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kfm86" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.284610 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blsgn\" (UniqueName: \"kubernetes.io/projected/64604bbb-190b-4850-97cc-07979a94d7aa-kube-api-access-blsgn\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.285022 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.285187 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-scripts\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " 
pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.285345 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-config-data\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.387573 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blsgn\" (UniqueName: \"kubernetes.io/projected/64604bbb-190b-4850-97cc-07979a94d7aa-kube-api-access-blsgn\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.387722 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.387799 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-scripts\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.387848 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-config-data\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: 
\"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.397597 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-scripts\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.398485 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-config-data\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.398788 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.410442 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blsgn\" (UniqueName: \"kubernetes.io/projected/64604bbb-190b-4850-97cc-07979a94d7aa-kube-api-access-blsgn\") pod \"nova-cell0-conductor-db-sync-9gmj9\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.493661 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.771752 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.976184 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 18 14:20:04 crc kubenswrapper[4817]: I0218 14:20:04.976584 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 18 14:20:05 crc kubenswrapper[4817]: I0218 14:20:05.019105 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 18 14:20:05 crc kubenswrapper[4817]: I0218 14:20:05.019189 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 18 14:20:05 crc kubenswrapper[4817]: I0218 14:20:05.025169 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 18 14:20:05 crc kubenswrapper[4817]: I0218 14:20:05.036080 4817 generic.go:334] "Generic (PLEG): container finished" podID="586418c8-0373-4f82-beba-46a811db26a7" containerID="e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5" exitCode=143 Feb 18 14:20:05 crc kubenswrapper[4817]: I0218 14:20:05.036935 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"586418c8-0373-4f82-beba-46a811db26a7","Type":"ContainerDied","Data":"e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5"} Feb 18 14:20:05 crc kubenswrapper[4817]: I0218 14:20:05.037819 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 18 14:20:05 crc kubenswrapper[4817]: I0218 14:20:05.038049 4817 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 18 14:20:06 crc kubenswrapper[4817]: I0218 14:20:06.424137 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9gmj9"] Feb 18 14:20:07 crc kubenswrapper[4817]: I0218 14:20:07.070921 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9gmj9" event={"ID":"64604bbb-190b-4850-97cc-07979a94d7aa","Type":"ContainerStarted","Data":"ce8f73eb108da9dd58a23ffb705d73f0968359473237244b1c528e187d1931fe"} Feb 18 14:20:07 crc kubenswrapper[4817]: I0218 14:20:07.074362 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" event={"ID":"fe95924e-a4f6-4844-b496-461e91941a16","Type":"ContainerStarted","Data":"b16cfcffd52bda8befb4ab613338c503c09798ef2a97175cdc361648d68ddf89"} Feb 18 14:20:07 crc kubenswrapper[4817]: I0218 14:20:07.074413 4817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 14:20:07 crc kubenswrapper[4817]: I0218 14:20:07.074773 4817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 14:20:07 crc kubenswrapper[4817]: I0218 14:20:07.074887 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:20:07 crc kubenswrapper[4817]: I0218 14:20:07.103328 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" podStartSLOduration=9.103307832 podStartE2EDuration="9.103307832s" podCreationTimestamp="2026-02-18 14:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:20:07.095566427 +0000 UTC m=+1269.671102410" watchObservedRunningTime="2026-02-18 14:20:07.103307832 +0000 UTC m=+1269.678843815" Feb 18 14:20:08 crc kubenswrapper[4817]: I0218 14:20:08.090461 4817 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4194006-61f7-4c39-91cf-bc5a9003a2d1","Type":"ContainerStarted","Data":"d42220dc3a5db62bc015f88e2567711787807bfa8400304a06effd817ce6fd53"} Feb 18 14:20:08 crc kubenswrapper[4817]: I0218 14:20:08.096673 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"fd188298-86b3-470b-b3ab-d3d3c9e356a7","Type":"ContainerStarted","Data":"da730a3033788be135c8c0fdb0570d392868504128e0895329ceaad366c8a371"} Feb 18 14:20:08 crc kubenswrapper[4817]: I0218 14:20:08.102630 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqhxh" event={"ID":"a938bc12-3666-41cf-b8e5-3fa647fe32f0","Type":"ContainerStarted","Data":"ccc70cce335eba799c9289c4deadb09cf831ff07a39cc518d9ff78cb54df61aa"} Feb 18 14:20:08 crc kubenswrapper[4817]: I0218 14:20:08.127474 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=3.9600945960000002 podStartE2EDuration="10.127450866s" podCreationTimestamp="2026-02-18 14:19:58 +0000 UTC" firstStartedPulling="2026-02-18 14:20:00.463689311 +0000 UTC m=+1263.039225294" lastFinishedPulling="2026-02-18 14:20:06.631045581 +0000 UTC m=+1269.206581564" observedRunningTime="2026-02-18 14:20:08.115816214 +0000 UTC m=+1270.691352217" watchObservedRunningTime="2026-02-18 14:20:08.127450866 +0000 UTC m=+1270.702986839" Feb 18 14:20:08 crc kubenswrapper[4817]: I0218 14:20:08.139392 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 18 14:20:08 crc kubenswrapper[4817]: I0218 14:20:08.146895 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sqhxh" podStartSLOduration=5.892554495 podStartE2EDuration="14.146874565s" podCreationTimestamp="2026-02-18 14:19:54 +0000 UTC" firstStartedPulling="2026-02-18 14:19:58.543571095 +0000 UTC 
m=+1261.119107078" lastFinishedPulling="2026-02-18 14:20:06.797891165 +0000 UTC m=+1269.373427148" observedRunningTime="2026-02-18 14:20:08.140582996 +0000 UTC m=+1270.716118979" watchObservedRunningTime="2026-02-18 14:20:08.146874565 +0000 UTC m=+1270.722410548" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.316458 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.316796 4817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.381023 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.454873 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.455128 4817 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.521673 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.564094 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7c8c8d4f9c-f58g5" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.661566 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f8cb89c64-cqrwn"] Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.662892 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f8cb89c64-cqrwn" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-api" containerID="cri-o://0a31c6c6302c97db31d81c9b02b8cb8ea2ed7b56fec596129d4c5fc3be5ef214" gracePeriod=30 Feb 18 14:20:09 crc 
kubenswrapper[4817]: I0218 14:20:09.663217 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f8cb89c64-cqrwn" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-httpd" containerID="cri-o://4c7116c6393fc406c9f817e4c5a1d598219578470179be0f40902636c9680953" gracePeriod=30 Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.695666 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-f8cb89c64-cqrwn" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.182:9696/\": read tcp 10.217.0.2:39752->10.217.0.182:9696: read: connection reset by peer" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.738047 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5scpm"] Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.741636 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.773218 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5scpm"] Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.861317 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z9hf\" (UniqueName: \"kubernetes.io/projected/01b17e66-ae59-413b-985f-ea5cf5e11600-kube-api-access-8z9hf\") pod \"redhat-operators-5scpm\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.861445 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-catalog-content\") pod \"redhat-operators-5scpm\" (UID: 
\"01b17e66-ae59-413b-985f-ea5cf5e11600\") " pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.861509 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-utilities\") pod \"redhat-operators-5scpm\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.965578 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-catalog-content\") pod \"redhat-operators-5scpm\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.966071 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-utilities\") pod \"redhat-operators-5scpm\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.966266 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-catalog-content\") pod \"redhat-operators-5scpm\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.966561 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z9hf\" (UniqueName: \"kubernetes.io/projected/01b17e66-ae59-413b-985f-ea5cf5e11600-kube-api-access-8z9hf\") pod \"redhat-operators-5scpm\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " 
pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:09 crc kubenswrapper[4817]: I0218 14:20:09.967262 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-utilities\") pod \"redhat-operators-5scpm\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.000891 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z9hf\" (UniqueName: \"kubernetes.io/projected/01b17e66-ae59-413b-985f-ea5cf5e11600-kube-api-access-8z9hf\") pod \"redhat-operators-5scpm\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.081582 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.173825 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="ceilometer-central-agent" containerID="cri-o://4cdaa3cd6fc55060608e76a9a87f57a81c7da3be01e10038a75e033a53455414" gracePeriod=30 Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.173863 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="proxy-httpd" containerID="cri-o://6b3051f0fd71a3cbedba97de6b634a50d0e246d350ae1a09733ca7d57c054e61" gracePeriod=30 Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.173924 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="sg-core" 
containerID="cri-o://d42220dc3a5db62bc015f88e2567711787807bfa8400304a06effd817ce6fd53" gracePeriod=30 Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.173938 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="ceilometer-notification-agent" containerID="cri-o://53bef78fafa0a35a8769da9ba7a098916b0691b1b9c9f335987c4c432fe29c8a" gracePeriod=30 Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.187940 4817 generic.go:334] "Generic (PLEG): container finished" podID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerID="4c7116c6393fc406c9f817e4c5a1d598219578470179be0f40902636c9680953" exitCode=0 Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.189739 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="fd188298-86b3-470b-b3ab-d3d3c9e356a7" containerName="cloudkitty-proc" containerID="cri-o://da730a3033788be135c8c0fdb0570d392868504128e0895329ceaad366c8a371" gracePeriod=30 Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.196788 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4194006-61f7-4c39-91cf-bc5a9003a2d1","Type":"ContainerStarted","Data":"6b3051f0fd71a3cbedba97de6b634a50d0e246d350ae1a09733ca7d57c054e61"} Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.204497 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8cb89c64-cqrwn" event={"ID":"20cb47f0-a64d-4e7d-93f0-1fed117df7ce","Type":"ContainerDied","Data":"4c7116c6393fc406c9f817e4c5a1d598219578470179be0f40902636c9680953"} Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.204611 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.219942 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=3.127262187 podStartE2EDuration="14.219915555s" podCreationTimestamp="2026-02-18 14:19:56 +0000 UTC" firstStartedPulling="2026-02-18 14:19:57.984575993 +0000 UTC m=+1260.560111976" lastFinishedPulling="2026-02-18 14:20:09.077229361 +0000 UTC m=+1271.652765344" observedRunningTime="2026-02-18 14:20:10.204723833 +0000 UTC m=+1272.780259826" watchObservedRunningTime="2026-02-18 14:20:10.219915555 +0000 UTC m=+1272.795451538" Feb 18 14:20:10 crc kubenswrapper[4817]: I0218 14:20:10.719559 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5scpm"] Feb 18 14:20:10 crc kubenswrapper[4817]: W0218 14:20:10.728139 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01b17e66_ae59_413b_985f_ea5cf5e11600.slice/crio-0f3f1f2e1d901caa396c938299bb07cf9d23e240f22830b2c8b880fe0d38e204 WatchSource:0}: Error finding container 0f3f1f2e1d901caa396c938299bb07cf9d23e240f22830b2c8b880fe0d38e204: Status 404 returned error can't find the container with id 0f3f1f2e1d901caa396c938299bb07cf9d23e240f22830b2c8b880fe0d38e204 Feb 18 14:20:11 crc kubenswrapper[4817]: I0218 14:20:11.213095 4817 generic.go:334] "Generic (PLEG): container finished" podID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerID="8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1" exitCode=0 Feb 18 14:20:11 crc kubenswrapper[4817]: I0218 14:20:11.213407 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scpm" event={"ID":"01b17e66-ae59-413b-985f-ea5cf5e11600","Type":"ContainerDied","Data":"8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1"} Feb 18 14:20:11 crc kubenswrapper[4817]: I0218 14:20:11.213439 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scpm" 
event={"ID":"01b17e66-ae59-413b-985f-ea5cf5e11600","Type":"ContainerStarted","Data":"0f3f1f2e1d901caa396c938299bb07cf9d23e240f22830b2c8b880fe0d38e204"} Feb 18 14:20:11 crc kubenswrapper[4817]: I0218 14:20:11.238126 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerID="6b3051f0fd71a3cbedba97de6b634a50d0e246d350ae1a09733ca7d57c054e61" exitCode=0 Feb 18 14:20:11 crc kubenswrapper[4817]: I0218 14:20:11.238162 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerID="d42220dc3a5db62bc015f88e2567711787807bfa8400304a06effd817ce6fd53" exitCode=2 Feb 18 14:20:11 crc kubenswrapper[4817]: I0218 14:20:11.238171 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerID="53bef78fafa0a35a8769da9ba7a098916b0691b1b9c9f335987c4c432fe29c8a" exitCode=0 Feb 18 14:20:11 crc kubenswrapper[4817]: I0218 14:20:11.238189 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4194006-61f7-4c39-91cf-bc5a9003a2d1","Type":"ContainerDied","Data":"6b3051f0fd71a3cbedba97de6b634a50d0e246d350ae1a09733ca7d57c054e61"} Feb 18 14:20:11 crc kubenswrapper[4817]: I0218 14:20:11.238214 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4194006-61f7-4c39-91cf-bc5a9003a2d1","Type":"ContainerDied","Data":"d42220dc3a5db62bc015f88e2567711787807bfa8400304a06effd817ce6fd53"} Feb 18 14:20:11 crc kubenswrapper[4817]: I0218 14:20:11.238225 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4194006-61f7-4c39-91cf-bc5a9003a2d1","Type":"ContainerDied","Data":"53bef78fafa0a35a8769da9ba7a098916b0691b1b9c9f335987c4c432fe29c8a"} Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.277007 4817 generic.go:334] "Generic (PLEG): container finished" podID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" 
containerID="0a31c6c6302c97db31d81c9b02b8cb8ea2ed7b56fec596129d4c5fc3be5ef214" exitCode=0 Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.277646 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8cb89c64-cqrwn" event={"ID":"20cb47f0-a64d-4e7d-93f0-1fed117df7ce","Type":"ContainerDied","Data":"0a31c6c6302c97db31d81c9b02b8cb8ea2ed7b56fec596129d4c5fc3be5ef214"} Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.863676 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.864076 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.864400 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.865574 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd719d9fe372437c635a5966e962ebc51e7647a95b5fd6491500726f444d522f"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.865664 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" 
podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://bd719d9fe372437c635a5966e962ebc51e7647a95b5fd6491500726f444d522f" gracePeriod=600 Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.924544 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f8cb89c64-cqrwn" Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.993846 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-config\") pod \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.994242 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-httpd-config\") pod \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.994347 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-combined-ca-bundle\") pod \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.994510 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-ovndb-tls-certs\") pod \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " Feb 18 14:20:12 crc kubenswrapper[4817]: I0218 14:20:12.994550 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjxkv\" (UniqueName: 
\"kubernetes.io/projected/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-kube-api-access-fjxkv\") pod \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\" (UID: \"20cb47f0-a64d-4e7d-93f0-1fed117df7ce\") " Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.008234 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-kube-api-access-fjxkv" (OuterVolumeSpecName: "kube-api-access-fjxkv") pod "20cb47f0-a64d-4e7d-93f0-1fed117df7ce" (UID: "20cb47f0-a64d-4e7d-93f0-1fed117df7ce"). InnerVolumeSpecName "kube-api-access-fjxkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.008946 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "20cb47f0-a64d-4e7d-93f0-1fed117df7ce" (UID: "20cb47f0-a64d-4e7d-93f0-1fed117df7ce"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.097815 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjxkv\" (UniqueName: \"kubernetes.io/projected/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-kube-api-access-fjxkv\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.098300 4817 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.115249 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20cb47f0-a64d-4e7d-93f0-1fed117df7ce" (UID: "20cb47f0-a64d-4e7d-93f0-1fed117df7ce"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.202394 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.254052 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-config" (OuterVolumeSpecName: "config") pod "20cb47f0-a64d-4e7d-93f0-1fed117df7ce" (UID: "20cb47f0-a64d-4e7d-93f0-1fed117df7ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.265735 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "20cb47f0-a64d-4e7d-93f0-1fed117df7ce" (UID: "20cb47f0-a64d-4e7d-93f0-1fed117df7ce"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.303772 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f8cb89c64-cqrwn" event={"ID":"20cb47f0-a64d-4e7d-93f0-1fed117df7ce","Type":"ContainerDied","Data":"1789427e05346805b6e3f5ca8b03aac7fe470c01de475e40c3bf13e1e87753f5"} Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.305005 4817 scope.go:117] "RemoveContainer" containerID="4c7116c6393fc406c9f817e4c5a1d598219578470179be0f40902636c9680953" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.303866 4817 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.305186 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/20cb47f0-a64d-4e7d-93f0-1fed117df7ce-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.304026 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f8cb89c64-cqrwn" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.314847 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scpm" event={"ID":"01b17e66-ae59-413b-985f-ea5cf5e11600","Type":"ContainerStarted","Data":"d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d"} Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.337232 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="bd719d9fe372437c635a5966e962ebc51e7647a95b5fd6491500726f444d522f" exitCode=0 Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.337295 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"bd719d9fe372437c635a5966e962ebc51e7647a95b5fd6491500726f444d522f"} Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.375015 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f8cb89c64-cqrwn"] Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.384295 4817 scope.go:117] "RemoveContainer" containerID="0a31c6c6302c97db31d81c9b02b8cb8ea2ed7b56fec596129d4c5fc3be5ef214" Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.421102 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f8cb89c64-cqrwn"] Feb 18 14:20:13 crc kubenswrapper[4817]: I0218 14:20:13.470265 4817 scope.go:117] "RemoveContainer" containerID="2a45288dd8059ad4005579ccd7ba9584a44ec34777e8d02ff7b0f8c874cff3f7" Feb 18 14:20:14 crc kubenswrapper[4817]: I0218 14:20:14.189955 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" path="/var/lib/kubelet/pods/20cb47f0-a64d-4e7d-93f0-1fed117df7ce/volumes" Feb 18 14:20:14 crc kubenswrapper[4817]: I0218 14:20:14.351058 4817 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"f904d428b6eee9716ba5ad8fa384beb59b260ceb6de6d026ad8fd0ef911a200e"} Feb 18 14:20:14 crc kubenswrapper[4817]: I0218 14:20:14.359146 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:20:14 crc kubenswrapper[4817]: I0218 14:20:14.465735 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-9w568"] Feb 18 14:20:14 crc kubenswrapper[4817]: I0218 14:20:14.466047 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-9w568" podUID="ed573b21-30a7-47b3-bdc8-7d8843074607" containerName="dnsmasq-dns" containerID="cri-o://693c3b70dfb28f9d779c518cc0749685cca27d25bf824daa09bc85e7ac834e16" gracePeriod=10 Feb 18 14:20:14 crc kubenswrapper[4817]: I0218 14:20:14.944172 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sqhxh" Feb 18 14:20:14 crc kubenswrapper[4817]: I0218 14:20:14.944519 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sqhxh" Feb 18 14:20:15 crc kubenswrapper[4817]: I0218 14:20:15.004249 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sqhxh" Feb 18 14:20:15 crc kubenswrapper[4817]: I0218 14:20:15.372083 4817 generic.go:334] "Generic (PLEG): container finished" podID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerID="d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d" exitCode=0 Feb 18 14:20:15 crc kubenswrapper[4817]: I0218 14:20:15.372148 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scpm" 
event={"ID":"01b17e66-ae59-413b-985f-ea5cf5e11600","Type":"ContainerDied","Data":"d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d"}
Feb 18 14:20:15 crc kubenswrapper[4817]: I0218 14:20:15.375839 4817 generic.go:334] "Generic (PLEG): container finished" podID="ed573b21-30a7-47b3-bdc8-7d8843074607" containerID="693c3b70dfb28f9d779c518cc0749685cca27d25bf824daa09bc85e7ac834e16" exitCode=0
Feb 18 14:20:15 crc kubenswrapper[4817]: I0218 14:20:15.375932 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-9w568" event={"ID":"ed573b21-30a7-47b3-bdc8-7d8843074607","Type":"ContainerDied","Data":"693c3b70dfb28f9d779c518cc0749685cca27d25bf824daa09bc85e7ac834e16"}
Feb 18 14:20:15 crc kubenswrapper[4817]: I0218 14:20:15.441749 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:20:17 crc kubenswrapper[4817]: I0218 14:20:17.099543 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sqhxh"]
Feb 18 14:20:17 crc kubenswrapper[4817]: I0218 14:20:17.404781 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sqhxh" podUID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerName="registry-server" containerID="cri-o://ccc70cce335eba799c9289c4deadb09cf831ff07a39cc518d9ff78cb54df61aa" gracePeriod=2
Feb 18 14:20:18 crc kubenswrapper[4817]: I0218 14:20:18.418070 4817 generic.go:334] "Generic (PLEG): container finished" podID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerID="ccc70cce335eba799c9289c4deadb09cf831ff07a39cc518d9ff78cb54df61aa" exitCode=0
Feb 18 14:20:18 crc kubenswrapper[4817]: I0218 14:20:18.418150 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqhxh" event={"ID":"a938bc12-3666-41cf-b8e5-3fa647fe32f0","Type":"ContainerDied","Data":"ccc70cce335eba799c9289c4deadb09cf831ff07a39cc518d9ff78cb54df61aa"}
Feb 18 14:20:18 crc kubenswrapper[4817]: I0218 14:20:18.421581 4817 generic.go:334] "Generic (PLEG): container finished" podID="fd188298-86b3-470b-b3ab-d3d3c9e356a7" containerID="da730a3033788be135c8c0fdb0570d392868504128e0895329ceaad366c8a371" exitCode=0
Feb 18 14:20:18 crc kubenswrapper[4817]: I0218 14:20:18.421621 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"fd188298-86b3-470b-b3ab-d3d3c9e356a7","Type":"ContainerDied","Data":"da730a3033788be135c8c0fdb0570d392868504128e0895329ceaad366c8a371"}
Feb 18 14:20:19 crc kubenswrapper[4817]: I0218 14:20:19.436433 4817 generic.go:334] "Generic (PLEG): container finished" podID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerID="4cdaa3cd6fc55060608e76a9a87f57a81c7da3be01e10038a75e033a53455414" exitCode=0
Feb 18 14:20:19 crc kubenswrapper[4817]: I0218 14:20:19.436505 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4194006-61f7-4c39-91cf-bc5a9003a2d1","Type":"ContainerDied","Data":"4cdaa3cd6fc55060608e76a9a87f57a81c7da3be01e10038a75e033a53455414"}
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.465385 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-9w568" event={"ID":"ed573b21-30a7-47b3-bdc8-7d8843074607","Type":"ContainerDied","Data":"0761755a4cd73f79ae2a2da424a8cddbf1d3fd7dc245bddcc8836a7a58abcdd4"}
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.466355 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0761755a4cd73f79ae2a2da424a8cddbf1d3fd7dc245bddcc8836a7a58abcdd4"
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.669794 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-9w568"
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.702698 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-nb\") pod \"ed573b21-30a7-47b3-bdc8-7d8843074607\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") "
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.702753 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sr6k\" (UniqueName: \"kubernetes.io/projected/ed573b21-30a7-47b3-bdc8-7d8843074607-kube-api-access-9sr6k\") pod \"ed573b21-30a7-47b3-bdc8-7d8843074607\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") "
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.702859 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-swift-storage-0\") pod \"ed573b21-30a7-47b3-bdc8-7d8843074607\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") "
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.703002 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-svc\") pod \"ed573b21-30a7-47b3-bdc8-7d8843074607\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") "
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.703034 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-sb\") pod \"ed573b21-30a7-47b3-bdc8-7d8843074607\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") "
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.703150 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-config\") pod \"ed573b21-30a7-47b3-bdc8-7d8843074607\" (UID: \"ed573b21-30a7-47b3-bdc8-7d8843074607\") "
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.715268 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed573b21-30a7-47b3-bdc8-7d8843074607-kube-api-access-9sr6k" (OuterVolumeSpecName: "kube-api-access-9sr6k") pod "ed573b21-30a7-47b3-bdc8-7d8843074607" (UID: "ed573b21-30a7-47b3-bdc8-7d8843074607"). InnerVolumeSpecName "kube-api-access-9sr6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.717044 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5784cf869f-9w568" podUID="ed573b21-30a7-47b3-bdc8-7d8843074607" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.185:5353: i/o timeout"
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.798958 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ed573b21-30a7-47b3-bdc8-7d8843074607" (UID: "ed573b21-30a7-47b3-bdc8-7d8843074607"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.821419 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-config" (OuterVolumeSpecName: "config") pod "ed573b21-30a7-47b3-bdc8-7d8843074607" (UID: "ed573b21-30a7-47b3-bdc8-7d8843074607"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.837725 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.837763 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sr6k\" (UniqueName: \"kubernetes.io/projected/ed573b21-30a7-47b3-bdc8-7d8843074607-kube-api-access-9sr6k\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.837776 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.844648 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed573b21-30a7-47b3-bdc8-7d8843074607" (UID: "ed573b21-30a7-47b3-bdc8-7d8843074607"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.882221 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed573b21-30a7-47b3-bdc8-7d8843074607" (UID: "ed573b21-30a7-47b3-bdc8-7d8843074607"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.883477 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed573b21-30a7-47b3-bdc8-7d8843074607" (UID: "ed573b21-30a7-47b3-bdc8-7d8843074607"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.940487 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.940532 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:21 crc kubenswrapper[4817]: I0218 14:20:21.940543 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed573b21-30a7-47b3-bdc8-7d8843074607-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.138231 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.276633 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-run-httpd\") pod \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.276794 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msdnz\" (UniqueName: \"kubernetes.io/projected/f4194006-61f7-4c39-91cf-bc5a9003a2d1-kube-api-access-msdnz\") pod \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.276871 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-config-data\") pod \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.276923 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-log-httpd\") pod \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.276956 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-sg-core-conf-yaml\") pod \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.277003 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-scripts\") pod \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.277134 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-combined-ca-bundle\") pod \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\" (UID: \"f4194006-61f7-4c39-91cf-bc5a9003a2d1\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.277412 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f4194006-61f7-4c39-91cf-bc5a9003a2d1" (UID: "f4194006-61f7-4c39-91cf-bc5a9003a2d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.279658 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f4194006-61f7-4c39-91cf-bc5a9003a2d1" (UID: "f4194006-61f7-4c39-91cf-bc5a9003a2d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.282843 4817 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.306694 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-scripts" (OuterVolumeSpecName: "scripts") pod "f4194006-61f7-4c39-91cf-bc5a9003a2d1" (UID: "f4194006-61f7-4c39-91cf-bc5a9003a2d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.339544 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.360793 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4194006-61f7-4c39-91cf-bc5a9003a2d1-kube-api-access-msdnz" (OuterVolumeSpecName: "kube-api-access-msdnz") pod "f4194006-61f7-4c39-91cf-bc5a9003a2d1" (UID: "f4194006-61f7-4c39-91cf-bc5a9003a2d1"). InnerVolumeSpecName "kube-api-access-msdnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.384495 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msdnz\" (UniqueName: \"kubernetes.io/projected/f4194006-61f7-4c39-91cf-bc5a9003a2d1-kube-api-access-msdnz\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.384530 4817 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4194006-61f7-4c39-91cf-bc5a9003a2d1-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.384540 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.487238 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"fd188298-86b3-470b-b3ab-d3d3c9e356a7","Type":"ContainerDied","Data":"72a75732403058e1abbb8578e2ef5a39a668f1fef6d27f61200418f8673cc17f"}
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.488661 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72a75732403058e1abbb8578e2ef5a39a668f1fef6d27f61200418f8673cc17f"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.490112 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-utilities\") pod \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.490314 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-catalog-content\") pod \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.490604 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zdjh\" (UniqueName: \"kubernetes.io/projected/a938bc12-3666-41cf-b8e5-3fa647fe32f0-kube-api-access-2zdjh\") pod \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\" (UID: \"a938bc12-3666-41cf-b8e5-3fa647fe32f0\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.495244 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-utilities" (OuterVolumeSpecName: "utilities") pod "a938bc12-3666-41cf-b8e5-3fa647fe32f0" (UID: "a938bc12-3666-41cf-b8e5-3fa647fe32f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.502326 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqhxh" event={"ID":"a938bc12-3666-41cf-b8e5-3fa647fe32f0","Type":"ContainerDied","Data":"ebf4308eacf7d19fc07c508db18b16275883c3406256abe69cbb109dc19a29b7"}
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.502412 4817 scope.go:117] "RemoveContainer" containerID="ccc70cce335eba799c9289c4deadb09cf831ff07a39cc518d9ff78cb54df61aa"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.502652 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqhxh"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.503041 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a938bc12-3666-41cf-b8e5-3fa647fe32f0-kube-api-access-2zdjh" (OuterVolumeSpecName: "kube-api-access-2zdjh") pod "a938bc12-3666-41cf-b8e5-3fa647fe32f0" (UID: "a938bc12-3666-41cf-b8e5-3fa647fe32f0"). InnerVolumeSpecName "kube-api-access-2zdjh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.509081 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f4194006-61f7-4c39-91cf-bc5a9003a2d1" (UID: "f4194006-61f7-4c39-91cf-bc5a9003a2d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.535754 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4194006-61f7-4c39-91cf-bc5a9003a2d1","Type":"ContainerDied","Data":"bf1c2612dfa02f87daf40bcb3f230742c2444888338fb4618acf64ca0a37966f"}
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.536322 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.542656 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-9w568"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.543777 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scpm" event={"ID":"01b17e66-ae59-413b-985f-ea5cf5e11600","Type":"ContainerStarted","Data":"d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da"}
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.569005 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a938bc12-3666-41cf-b8e5-3fa647fe32f0" (UID: "a938bc12-3666-41cf-b8e5-3fa647fe32f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.589717 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5scpm" podStartSLOduration=3.136621105 podStartE2EDuration="13.589701327s" podCreationTimestamp="2026-02-18 14:20:09 +0000 UTC" firstStartedPulling="2026-02-18 14:20:11.21824369 +0000 UTC m=+1273.793779673" lastFinishedPulling="2026-02-18 14:20:21.671323922 +0000 UTC m=+1284.246859895" observedRunningTime="2026-02-18 14:20:22.587252965 +0000 UTC m=+1285.162788938" watchObservedRunningTime="2026-02-18 14:20:22.589701327 +0000 UTC m=+1285.165237310"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.594646 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zdjh\" (UniqueName: \"kubernetes.io/projected/a938bc12-3666-41cf-b8e5-3fa647fe32f0-kube-api-access-2zdjh\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.594679 4817 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.594688 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.594698 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a938bc12-3666-41cf-b8e5-3fa647fe32f0-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.654003 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4194006-61f7-4c39-91cf-bc5a9003a2d1" (UID: "f4194006-61f7-4c39-91cf-bc5a9003a2d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.690364 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-config-data" (OuterVolumeSpecName: "config-data") pod "f4194006-61f7-4c39-91cf-bc5a9003a2d1" (UID: "f4194006-61f7-4c39-91cf-bc5a9003a2d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.696548 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.696591 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4194006-61f7-4c39-91cf-bc5a9003a2d1-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.773467 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.790391 4817 scope.go:117] "RemoveContainer" containerID="57a8e3c4eccbeb4ae0898dbd9c80d4b99913889c202df5693c9c7dc254444693"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.797655 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-9w568"]
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.823480 4817 scope.go:117] "RemoveContainer" containerID="67d9a30dc1dd9c7ee86d40c3ccc79eac7095411317c649564b8d192a189712ba"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.829970 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-9w568"]
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.880901 4817 scope.go:117] "RemoveContainer" containerID="6b3051f0fd71a3cbedba97de6b634a50d0e246d350ae1a09733ca7d57c054e61"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.899455 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-combined-ca-bundle\") pod \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.899908 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-scripts\") pod \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.900051 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data-custom\") pod \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.900165 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-certs\") pod \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.900271 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data\") pod \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.900511 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh79q\" (UniqueName: \"kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-kube-api-access-lh79q\") pod \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\" (UID: \"fd188298-86b3-470b-b3ab-d3d3c9e356a7\") "
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.909955 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-kube-api-access-lh79q" (OuterVolumeSpecName: "kube-api-access-lh79q") pod "fd188298-86b3-470b-b3ab-d3d3c9e356a7" (UID: "fd188298-86b3-470b-b3ab-d3d3c9e356a7"). InnerVolumeSpecName "kube-api-access-lh79q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.912192 4817 scope.go:117] "RemoveContainer" containerID="d42220dc3a5db62bc015f88e2567711787807bfa8400304a06effd817ce6fd53"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.914904 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-certs" (OuterVolumeSpecName: "certs") pod "fd188298-86b3-470b-b3ab-d3d3c9e356a7" (UID: "fd188298-86b3-470b-b3ab-d3d3c9e356a7"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.915673 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-scripts" (OuterVolumeSpecName: "scripts") pod "fd188298-86b3-470b-b3ab-d3d3c9e356a7" (UID: "fd188298-86b3-470b-b3ab-d3d3c9e356a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.916256 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fd188298-86b3-470b-b3ab-d3d3c9e356a7" (UID: "fd188298-86b3-470b-b3ab-d3d3c9e356a7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.921412 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sqhxh"]
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.961630 4817 scope.go:117] "RemoveContainer" containerID="53bef78fafa0a35a8769da9ba7a098916b0691b1b9c9f335987c4c432fe29c8a"
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.972381 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sqhxh"]
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.980090 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data" (OuterVolumeSpecName: "config-data") pod "fd188298-86b3-470b-b3ab-d3d3c9e356a7" (UID: "fd188298-86b3-470b-b3ab-d3d3c9e356a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.985929 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd188298-86b3-470b-b3ab-d3d3c9e356a7" (UID: "fd188298-86b3-470b-b3ab-d3d3c9e356a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.986824 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:20:22 crc kubenswrapper[4817]: I0218 14:20:22.997370 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.007793 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh79q\" (UniqueName: \"kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-kube-api-access-lh79q\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.007870 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.007886 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.007896 4817 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.007907 4817 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fd188298-86b3-470b-b3ab-d3d3c9e356a7-certs\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.007917 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd188298-86b3-470b-b3ab-d3d3c9e356a7-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012070 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012676 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerName="extract-utilities"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012698 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerName="extract-utilities"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012713 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed573b21-30a7-47b3-bdc8-7d8843074607" containerName="init"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012723 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed573b21-30a7-47b3-bdc8-7d8843074607" containerName="init"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012740 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-httpd"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012748 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-httpd"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012765 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="ceilometer-central-agent"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012773 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="ceilometer-central-agent"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012793 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd188298-86b3-470b-b3ab-d3d3c9e356a7" containerName="cloudkitty-proc"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012800 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd188298-86b3-470b-b3ab-d3d3c9e356a7" containerName="cloudkitty-proc"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012818 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerName="registry-server"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012826 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerName="registry-server"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012839 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-api"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012847 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-api"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012866 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="sg-core"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012873 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="sg-core"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012899 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="ceilometer-notification-agent"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012907 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="ceilometer-notification-agent"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012919 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed573b21-30a7-47b3-bdc8-7d8843074607" containerName="dnsmasq-dns"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012928 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed573b21-30a7-47b3-bdc8-7d8843074607" containerName="dnsmasq-dns"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012943 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerName="extract-content"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012951 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerName="extract-content"
Feb 18 14:20:23 crc kubenswrapper[4817]: E0218 14:20:23.012970 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="proxy-httpd"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.012996 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="proxy-httpd"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.013253 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" containerName="registry-server"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.013269 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="ceilometer-notification-agent"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.013280 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="sg-core"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.013294 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed573b21-30a7-47b3-bdc8-7d8843074607" containerName="dnsmasq-dns"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.013309 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="ceilometer-central-agent"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.013329 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-api"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.013347 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd188298-86b3-470b-b3ab-d3d3c9e356a7" containerName="cloudkitty-proc"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.013361 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cb47f0-a64d-4e7d-93f0-1fed117df7ce" containerName="neutron-httpd"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.013372 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" containerName="proxy-httpd"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.019233 4817 scope.go:117] "RemoveContainer" containerID="4cdaa3cd6fc55060608e76a9a87f57a81c7da3be01e10038a75e033a53455414"
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.020750 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.020882 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.025496 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.025557 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.109423 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-config-data\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.109485 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8f6r\" (UniqueName: \"kubernetes.io/projected/95079ccd-b9d1-4dc1-883a-5a6008410950-kube-api-access-r8f6r\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.109614 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-scripts\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.109635 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-log-httpd\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.109697 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.109718 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.109951 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-run-httpd\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.213383 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-config-data\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.213441 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8f6r\" (UniqueName: \"kubernetes.io/projected/95079ccd-b9d1-4dc1-883a-5a6008410950-kube-api-access-r8f6r\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.213534 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-scripts\") pod 
\"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.213558 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-log-httpd\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.213626 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.213652 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.213711 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-run-httpd\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.214505 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-log-httpd\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.214904 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-run-httpd\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.218321 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.218377 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-scripts\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.218663 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-config-data\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.219490 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.237468 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8f6r\" (UniqueName: \"kubernetes.io/projected/95079ccd-b9d1-4dc1-883a-5a6008410950-kube-api-access-r8f6r\") pod \"ceilometer-0\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 
14:20:23.353899 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.571700 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9gmj9" event={"ID":"64604bbb-190b-4850-97cc-07979a94d7aa","Type":"ContainerStarted","Data":"a70619c12ed1ff62eccf9a163deeae4ddb7b5cb7ff13b5191cb42343397ce290"} Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.579787 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.600323 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-9gmj9" podStartSLOduration=4.346633316 podStartE2EDuration="19.600303731s" podCreationTimestamp="2026-02-18 14:20:04 +0000 UTC" firstStartedPulling="2026-02-18 14:20:06.419112783 +0000 UTC m=+1268.994648766" lastFinishedPulling="2026-02-18 14:20:21.672783198 +0000 UTC m=+1284.248319181" observedRunningTime="2026-02-18 14:20:23.597226933 +0000 UTC m=+1286.172762936" watchObservedRunningTime="2026-02-18 14:20:23.600303731 +0000 UTC m=+1286.175839724" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.641627 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.660735 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.670776 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.672036 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.675654 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.694208 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.832213 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.832622 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.832734 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kkcz\" (UniqueName: \"kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-kube-api-access-9kkcz\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.832796 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-scripts\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.832831 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-certs\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.832859 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.934704 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kkcz\" (UniqueName: \"kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-kube-api-access-9kkcz\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.934812 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-scripts\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.934848 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-certs\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.934878 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.934911 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.935089 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.950714 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-certs\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.953247 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.953668 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 
14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.970531 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-scripts\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.971287 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.972095 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:23 crc kubenswrapper[4817]: I0218 14:20:23.974676 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kkcz\" (UniqueName: \"kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-kube-api-access-9kkcz\") pod \"cloudkitty-proc-0\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:20:24 crc kubenswrapper[4817]: I0218 14:20:24.004709 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 18 14:20:24 crc kubenswrapper[4817]: I0218 14:20:24.188522 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a938bc12-3666-41cf-b8e5-3fa647fe32f0" path="/var/lib/kubelet/pods/a938bc12-3666-41cf-b8e5-3fa647fe32f0/volumes" Feb 18 14:20:24 crc kubenswrapper[4817]: I0218 14:20:24.190495 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed573b21-30a7-47b3-bdc8-7d8843074607" path="/var/lib/kubelet/pods/ed573b21-30a7-47b3-bdc8-7d8843074607/volumes" Feb 18 14:20:24 crc kubenswrapper[4817]: I0218 14:20:24.191664 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4194006-61f7-4c39-91cf-bc5a9003a2d1" path="/var/lib/kubelet/pods/f4194006-61f7-4c39-91cf-bc5a9003a2d1/volumes" Feb 18 14:20:24 crc kubenswrapper[4817]: I0218 14:20:24.193320 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd188298-86b3-470b-b3ab-d3d3c9e356a7" path="/var/lib/kubelet/pods/fd188298-86b3-470b-b3ab-d3d3c9e356a7/volumes" Feb 18 14:20:24 crc kubenswrapper[4817]: I0218 14:20:24.510918 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 18 14:20:24 crc kubenswrapper[4817]: W0218 14:20:24.522478 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod542029d6_ef61_49d1_88a7_c206c64e193a.slice/crio-1d038fe129143d145b72ed4c3401bd5ad4b2053bdcda35c44413a585ce7000b8 WatchSource:0}: Error finding container 1d038fe129143d145b72ed4c3401bd5ad4b2053bdcda35c44413a585ce7000b8: Status 404 returned error can't find the container with id 1d038fe129143d145b72ed4c3401bd5ad4b2053bdcda35c44413a585ce7000b8 Feb 18 14:20:24 crc kubenswrapper[4817]: I0218 14:20:24.624417 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" 
event={"ID":"542029d6-ef61-49d1-88a7-c206c64e193a","Type":"ContainerStarted","Data":"1d038fe129143d145b72ed4c3401bd5ad4b2053bdcda35c44413a585ce7000b8"} Feb 18 14:20:24 crc kubenswrapper[4817]: I0218 14:20:24.627834 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95079ccd-b9d1-4dc1-883a-5a6008410950","Type":"ContainerStarted","Data":"41244c7097aadd0e8670e09212d7c7db78cb8bce5913a3faddc518eb5015652f"} Feb 18 14:20:25 crc kubenswrapper[4817]: I0218 14:20:25.638715 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"542029d6-ef61-49d1-88a7-c206c64e193a","Type":"ContainerStarted","Data":"ed628f07772e2e05d3087689a31bbe40ceac0ae62b9496713f1d17d667347f91"} Feb 18 14:20:25 crc kubenswrapper[4817]: I0218 14:20:25.644144 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95079ccd-b9d1-4dc1-883a-5a6008410950","Type":"ContainerStarted","Data":"d35f45974faf0b8175b3a58e59857fe1387c7a30ddcbd478b4070d0d2cb2bc53"} Feb 18 14:20:25 crc kubenswrapper[4817]: I0218 14:20:25.644217 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95079ccd-b9d1-4dc1-883a-5a6008410950","Type":"ContainerStarted","Data":"89fd86e07eefacfa61a4a36819bd10a4cc33854bfac7dac7532c8204a1f8d04e"} Feb 18 14:20:25 crc kubenswrapper[4817]: I0218 14:20:25.676705 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.6766693139999997 podStartE2EDuration="2.676669314s" podCreationTimestamp="2026-02-18 14:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:20:25.662436527 +0000 UTC m=+1288.237972510" watchObservedRunningTime="2026-02-18 14:20:25.676669314 +0000 UTC m=+1288.252205297" Feb 18 14:20:26 crc kubenswrapper[4817]: I0218 14:20:26.658407 4817 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95079ccd-b9d1-4dc1-883a-5a6008410950","Type":"ContainerStarted","Data":"0ddb4ce4b3a2d18c0ff384466318dc446b3058664add19d3798e6e55e99d57c6"} Feb 18 14:20:30 crc kubenswrapper[4817]: I0218 14:20:30.082512 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:30 crc kubenswrapper[4817]: I0218 14:20:30.083128 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:20:31 crc kubenswrapper[4817]: I0218 14:20:31.143889 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5scpm" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" probeResult="failure" output=< Feb 18 14:20:31 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 14:20:31 crc kubenswrapper[4817]: > Feb 18 14:20:31 crc kubenswrapper[4817]: I0218 14:20:31.735547 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95079ccd-b9d1-4dc1-883a-5a6008410950","Type":"ContainerStarted","Data":"d2c9ca9b0372fb80ba6f121f84498a84bcdfd45daee5639ee8f8feed64928058"} Feb 18 14:20:31 crc kubenswrapper[4817]: I0218 14:20:31.736763 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 14:20:31 crc kubenswrapper[4817]: I0218 14:20:31.768325 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.302560973 podStartE2EDuration="9.768309043s" podCreationTimestamp="2026-02-18 14:20:22 +0000 UTC" firstStartedPulling="2026-02-18 14:20:23.99370666 +0000 UTC m=+1286.569242643" lastFinishedPulling="2026-02-18 14:20:30.45945473 +0000 UTC m=+1293.034990713" observedRunningTime="2026-02-18 14:20:31.766302831 +0000 UTC 
m=+1294.341838824" watchObservedRunningTime="2026-02-18 14:20:31.768309043 +0000 UTC m=+1294.343845026" Feb 18 14:20:32 crc kubenswrapper[4817]: I0218 14:20:32.814753 4817 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod43f9dae0-f2ed-4f91-b922-6f3432c8997d"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod43f9dae0-f2ed-4f91-b922-6f3432c8997d] : Timed out while waiting for systemd to remove kubepods-besteffort-pod43f9dae0_f2ed_4f91_b922_6f3432c8997d.slice" Feb 18 14:20:32 crc kubenswrapper[4817]: I0218 14:20:32.815819 4817 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podc5d3725d-5bb0-4edd-b707-6690d2ac99f5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podc5d3725d-5bb0-4edd-b707-6690d2ac99f5] : Timed out while waiting for systemd to remove kubepods-besteffort-podc5d3725d_5bb0_4edd_b707_6690d2ac99f5.slice" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.491935 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.628921 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586418c8-0373-4f82-beba-46a811db26a7-logs\") pod \"586418c8-0373-4f82-beba-46a811db26a7\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.629043 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-certs\") pod \"586418c8-0373-4f82-beba-46a811db26a7\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.629095 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-scripts\") pod \"586418c8-0373-4f82-beba-46a811db26a7\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.629154 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data\") pod \"586418c8-0373-4f82-beba-46a811db26a7\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.629315 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-combined-ca-bundle\") pod \"586418c8-0373-4f82-beba-46a811db26a7\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.629351 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data-custom\") pod \"586418c8-0373-4f82-beba-46a811db26a7\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.629420 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8npd\" (UniqueName: \"kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-kube-api-access-v8npd\") pod \"586418c8-0373-4f82-beba-46a811db26a7\" (UID: \"586418c8-0373-4f82-beba-46a811db26a7\") " Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.629414 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/586418c8-0373-4f82-beba-46a811db26a7-logs" (OuterVolumeSpecName: "logs") pod "586418c8-0373-4f82-beba-46a811db26a7" (UID: "586418c8-0373-4f82-beba-46a811db26a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.630188 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586418c8-0373-4f82-beba-46a811db26a7-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.635406 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "586418c8-0373-4f82-beba-46a811db26a7" (UID: "586418c8-0373-4f82-beba-46a811db26a7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.635522 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-scripts" (OuterVolumeSpecName: "scripts") pod "586418c8-0373-4f82-beba-46a811db26a7" (UID: "586418c8-0373-4f82-beba-46a811db26a7"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.635541 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-certs" (OuterVolumeSpecName: "certs") pod "586418c8-0373-4f82-beba-46a811db26a7" (UID: "586418c8-0373-4f82-beba-46a811db26a7"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.635612 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-kube-api-access-v8npd" (OuterVolumeSpecName: "kube-api-access-v8npd") pod "586418c8-0373-4f82-beba-46a811db26a7" (UID: "586418c8-0373-4f82-beba-46a811db26a7"). InnerVolumeSpecName "kube-api-access-v8npd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.664251 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data" (OuterVolumeSpecName: "config-data") pod "586418c8-0373-4f82-beba-46a811db26a7" (UID: "586418c8-0373-4f82-beba-46a811db26a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.670548 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "586418c8-0373-4f82-beba-46a811db26a7" (UID: "586418c8-0373-4f82-beba-46a811db26a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.732484 4817 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.732515 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.732525 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.732535 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.732552 4817 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/586418c8-0373-4f82-beba-46a811db26a7-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.732562 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8npd\" (UniqueName: \"kubernetes.io/projected/586418c8-0373-4f82-beba-46a811db26a7-kube-api-access-v8npd\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.765972 4817 generic.go:334] "Generic (PLEG): container finished" podID="586418c8-0373-4f82-beba-46a811db26a7" containerID="0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268" exitCode=137 Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.766038 4817 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"586418c8-0373-4f82-beba-46a811db26a7","Type":"ContainerDied","Data":"0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268"} Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.766068 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"586418c8-0373-4f82-beba-46a811db26a7","Type":"ContainerDied","Data":"e0e99e324a4fb34cd20cb7a64884e6c634b8f2c04fcddef3b49fc3812c509714"} Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.766088 4817 scope.go:117] "RemoveContainer" containerID="0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.766228 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.805549 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.814796 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.818479 4817 scope.go:117] "RemoveContainer" containerID="e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.840312 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:20:34 crc kubenswrapper[4817]: E0218 14:20:34.840713 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586418c8-0373-4f82-beba-46a811db26a7" containerName="cloudkitty-api" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.840730 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="586418c8-0373-4f82-beba-46a811db26a7" containerName="cloudkitty-api" Feb 18 14:20:34 crc kubenswrapper[4817]: E0218 14:20:34.840744 4817 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="586418c8-0373-4f82-beba-46a811db26a7" containerName="cloudkitty-api-log" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.840751 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="586418c8-0373-4f82-beba-46a811db26a7" containerName="cloudkitty-api-log" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.840925 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="586418c8-0373-4f82-beba-46a811db26a7" containerName="cloudkitty-api-log" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.840939 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="586418c8-0373-4f82-beba-46a811db26a7" containerName="cloudkitty-api" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.842100 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.853409 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.854270 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.854474 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.885967 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.891035 4817 scope.go:117] "RemoveContainer" containerID="0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268" Feb 18 14:20:34 crc kubenswrapper[4817]: E0218 14:20:34.901115 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268\": container with ID 
starting with 0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268 not found: ID does not exist" containerID="0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.901165 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268"} err="failed to get container status \"0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268\": rpc error: code = NotFound desc = could not find container \"0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268\": container with ID starting with 0de9808db8100d4ea9abd06beb1c7790c4ca966af02c95ad91f73845cbc40268 not found: ID does not exist" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.901196 4817 scope.go:117] "RemoveContainer" containerID="e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5" Feb 18 14:20:34 crc kubenswrapper[4817]: E0218 14:20:34.901697 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5\": container with ID starting with e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5 not found: ID does not exist" containerID="e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.901725 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5"} err="failed to get container status \"e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5\": rpc error: code = NotFound desc = could not find container \"e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5\": container with ID starting with e5aaa71d4e86e06b449670cb9afe3d3fcddad9879a10e5d0f4dbaa436c7643c5 not found: 
ID does not exist" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.937655 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-logs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.937769 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.938024 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2cdm\" (UniqueName: \"kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-kube-api-access-n2cdm\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.938136 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-certs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.938200 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.938448 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-scripts\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.938477 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.938510 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:34 crc kubenswrapper[4817]: I0218 14:20:34.938526 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.040559 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-scripts\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.040598 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.040626 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.040644 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.040699 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-logs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.040730 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.040780 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2cdm\" (UniqueName: \"kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-kube-api-access-n2cdm\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " 
pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.040819 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-certs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.040842 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.041826 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-logs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.044703 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.044718 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.045273 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-scripts\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.046029 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.059553 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-certs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.060107 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.060224 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.079723 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2cdm\" (UniqueName: \"kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-kube-api-access-n2cdm\") pod \"cloudkitty-api-0\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") " pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc 
kubenswrapper[4817]: I0218 14:20:35.175949 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.658900 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:20:35 crc kubenswrapper[4817]: W0218 14:20:35.659910 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa4f949_7d98_4a16_a5ed_ba88f9e89820.slice/crio-5bd3db7d0de001e87bd71405af9e1e34d5442053f8913579094238d1dbf59dac WatchSource:0}: Error finding container 5bd3db7d0de001e87bd71405af9e1e34d5442053f8913579094238d1dbf59dac: Status 404 returned error can't find the container with id 5bd3db7d0de001e87bd71405af9e1e34d5442053f8913579094238d1dbf59dac Feb 18 14:20:35 crc kubenswrapper[4817]: I0218 14:20:35.778438 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"eaa4f949-7d98-4a16-a5ed-ba88f9e89820","Type":"ContainerStarted","Data":"5bd3db7d0de001e87bd71405af9e1e34d5442053f8913579094238d1dbf59dac"} Feb 18 14:20:36 crc kubenswrapper[4817]: I0218 14:20:36.198143 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586418c8-0373-4f82-beba-46a811db26a7" path="/var/lib/kubelet/pods/586418c8-0373-4f82-beba-46a811db26a7/volumes" Feb 18 14:20:36 crc kubenswrapper[4817]: I0218 14:20:36.791996 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"eaa4f949-7d98-4a16-a5ed-ba88f9e89820","Type":"ContainerStarted","Data":"fec6236b0102236a0ffc04af329220acf3c2ccbfcacfee8aa9a74f16d2df810a"} Feb 18 14:20:36 crc kubenswrapper[4817]: I0218 14:20:36.793191 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"eaa4f949-7d98-4a16-a5ed-ba88f9e89820","Type":"ContainerStarted","Data":"64325c38df09c7e7258bae291e3bc9a3bc32ce81b5be56316cf60579570582a1"} Feb 18 14:20:36 crc kubenswrapper[4817]: I0218 14:20:36.793290 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 18 14:20:36 crc kubenswrapper[4817]: I0218 14:20:36.814410 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.81438888 podStartE2EDuration="2.81438888s" podCreationTimestamp="2026-02-18 14:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:20:36.811120925 +0000 UTC m=+1299.386656918" watchObservedRunningTime="2026-02-18 14:20:36.81438888 +0000 UTC m=+1299.389924863" Feb 18 14:20:39 crc kubenswrapper[4817]: I0218 14:20:39.456373 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="586418c8-0373-4f82-beba-46a811db26a7" containerName="cloudkitty-api" probeResult="failure" output="Get \"http://10.217.0.205:8889/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 14:20:41 crc kubenswrapper[4817]: I0218 14:20:41.154276 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5scpm" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" probeResult="failure" output=< Feb 18 14:20:41 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 14:20:41 crc kubenswrapper[4817]: > Feb 18 14:20:44 crc kubenswrapper[4817]: I0218 14:20:44.597003 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:20:44 crc kubenswrapper[4817]: I0218 14:20:44.597890 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="ceilometer-central-agent" containerID="cri-o://89fd86e07eefacfa61a4a36819bd10a4cc33854bfac7dac7532c8204a1f8d04e" gracePeriod=30 Feb 18 14:20:44 crc kubenswrapper[4817]: I0218 14:20:44.598733 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="proxy-httpd" containerID="cri-o://d2c9ca9b0372fb80ba6f121f84498a84bcdfd45daee5639ee8f8feed64928058" gracePeriod=30 Feb 18 14:20:44 crc kubenswrapper[4817]: I0218 14:20:44.598803 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="sg-core" containerID="cri-o://0ddb4ce4b3a2d18c0ff384466318dc446b3058664add19d3798e6e55e99d57c6" gracePeriod=30 Feb 18 14:20:44 crc kubenswrapper[4817]: I0218 14:20:44.598850 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="ceilometer-notification-agent" containerID="cri-o://d35f45974faf0b8175b3a58e59857fe1387c7a30ddcbd478b4070d0d2cb2bc53" gracePeriod=30 Feb 18 14:20:44 crc kubenswrapper[4817]: I0218 14:20:44.611490 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.208:3000/\": EOF" Feb 18 14:20:44 crc kubenswrapper[4817]: I0218 14:20:44.886839 4817 generic.go:334] "Generic (PLEG): container finished" podID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerID="d2c9ca9b0372fb80ba6f121f84498a84bcdfd45daee5639ee8f8feed64928058" exitCode=0 Feb 18 14:20:44 crc kubenswrapper[4817]: I0218 14:20:44.886903 4817 generic.go:334] "Generic (PLEG): container finished" podID="95079ccd-b9d1-4dc1-883a-5a6008410950" 
containerID="0ddb4ce4b3a2d18c0ff384466318dc446b3058664add19d3798e6e55e99d57c6" exitCode=2 Feb 18 14:20:44 crc kubenswrapper[4817]: I0218 14:20:44.886968 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95079ccd-b9d1-4dc1-883a-5a6008410950","Type":"ContainerDied","Data":"d2c9ca9b0372fb80ba6f121f84498a84bcdfd45daee5639ee8f8feed64928058"} Feb 18 14:20:44 crc kubenswrapper[4817]: I0218 14:20:44.887068 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95079ccd-b9d1-4dc1-883a-5a6008410950","Type":"ContainerDied","Data":"0ddb4ce4b3a2d18c0ff384466318dc446b3058664add19d3798e6e55e99d57c6"} Feb 18 14:20:45 crc kubenswrapper[4817]: I0218 14:20:45.900629 4817 generic.go:334] "Generic (PLEG): container finished" podID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerID="89fd86e07eefacfa61a4a36819bd10a4cc33854bfac7dac7532c8204a1f8d04e" exitCode=0 Feb 18 14:20:45 crc kubenswrapper[4817]: I0218 14:20:45.900686 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95079ccd-b9d1-4dc1-883a-5a6008410950","Type":"ContainerDied","Data":"89fd86e07eefacfa61a4a36819bd10a4cc33854bfac7dac7532c8204a1f8d04e"} Feb 18 14:20:48 crc kubenswrapper[4817]: I0218 14:20:48.954470 4817 generic.go:334] "Generic (PLEG): container finished" podID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerID="d35f45974faf0b8175b3a58e59857fe1387c7a30ddcbd478b4070d0d2cb2bc53" exitCode=0 Feb 18 14:20:48 crc kubenswrapper[4817]: I0218 14:20:48.954872 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95079ccd-b9d1-4dc1-883a-5a6008410950","Type":"ContainerDied","Data":"d35f45974faf0b8175b3a58e59857fe1387c7a30ddcbd478b4070d0d2cb2bc53"} Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.357832 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.458773 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-sg-core-conf-yaml\") pod \"95079ccd-b9d1-4dc1-883a-5a6008410950\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.458889 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-run-httpd\") pod \"95079ccd-b9d1-4dc1-883a-5a6008410950\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.458916 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-config-data\") pod \"95079ccd-b9d1-4dc1-883a-5a6008410950\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.458964 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-log-httpd\") pod \"95079ccd-b9d1-4dc1-883a-5a6008410950\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.459011 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-combined-ca-bundle\") pod \"95079ccd-b9d1-4dc1-883a-5a6008410950\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.459070 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-scripts\") pod \"95079ccd-b9d1-4dc1-883a-5a6008410950\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.459153 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8f6r\" (UniqueName: \"kubernetes.io/projected/95079ccd-b9d1-4dc1-883a-5a6008410950-kube-api-access-r8f6r\") pod \"95079ccd-b9d1-4dc1-883a-5a6008410950\" (UID: \"95079ccd-b9d1-4dc1-883a-5a6008410950\") " Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.459333 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "95079ccd-b9d1-4dc1-883a-5a6008410950" (UID: "95079ccd-b9d1-4dc1-883a-5a6008410950"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.459592 4817 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.459611 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "95079ccd-b9d1-4dc1-883a-5a6008410950" (UID: "95079ccd-b9d1-4dc1-883a-5a6008410950"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.503262 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95079ccd-b9d1-4dc1-883a-5a6008410950-kube-api-access-r8f6r" (OuterVolumeSpecName: "kube-api-access-r8f6r") pod "95079ccd-b9d1-4dc1-883a-5a6008410950" (UID: "95079ccd-b9d1-4dc1-883a-5a6008410950"). InnerVolumeSpecName "kube-api-access-r8f6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.510057 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-scripts" (OuterVolumeSpecName: "scripts") pod "95079ccd-b9d1-4dc1-883a-5a6008410950" (UID: "95079ccd-b9d1-4dc1-883a-5a6008410950"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.539241 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "95079ccd-b9d1-4dc1-883a-5a6008410950" (UID: "95079ccd-b9d1-4dc1-883a-5a6008410950"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.563394 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.563435 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8f6r\" (UniqueName: \"kubernetes.io/projected/95079ccd-b9d1-4dc1-883a-5a6008410950-kube-api-access-r8f6r\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.563451 4817 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.563461 4817 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95079ccd-b9d1-4dc1-883a-5a6008410950-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.610103 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95079ccd-b9d1-4dc1-883a-5a6008410950" (UID: "95079ccd-b9d1-4dc1-883a-5a6008410950"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.638540 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-config-data" (OuterVolumeSpecName: "config-data") pod "95079ccd-b9d1-4dc1-883a-5a6008410950" (UID: "95079ccd-b9d1-4dc1-883a-5a6008410950"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.665505 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.665804 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95079ccd-b9d1-4dc1-883a-5a6008410950-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.969690 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95079ccd-b9d1-4dc1-883a-5a6008410950","Type":"ContainerDied","Data":"41244c7097aadd0e8670e09212d7c7db78cb8bce5913a3faddc518eb5015652f"} Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.969755 4817 scope.go:117] "RemoveContainer" containerID="d2c9ca9b0372fb80ba6f121f84498a84bcdfd45daee5639ee8f8feed64928058" Feb 18 14:20:49 crc kubenswrapper[4817]: I0218 14:20:49.969931 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.023449 4817 scope.go:117] "RemoveContainer" containerID="0ddb4ce4b3a2d18c0ff384466318dc446b3058664add19d3798e6e55e99d57c6" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.032063 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.063546 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.074001 4817 scope.go:117] "RemoveContainer" containerID="d35f45974faf0b8175b3a58e59857fe1387c7a30ddcbd478b4070d0d2cb2bc53" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.073999 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:20:50 crc kubenswrapper[4817]: E0218 14:20:50.074762 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="ceilometer-notification-agent" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.074781 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="ceilometer-notification-agent" Feb 18 14:20:50 crc kubenswrapper[4817]: E0218 14:20:50.074817 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="proxy-httpd" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.074825 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="proxy-httpd" Feb 18 14:20:50 crc kubenswrapper[4817]: E0218 14:20:50.074841 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="sg-core" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.074850 4817 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="sg-core" Feb 18 14:20:50 crc kubenswrapper[4817]: E0218 14:20:50.074869 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="ceilometer-central-agent" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.074876 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="ceilometer-central-agent" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.075124 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="proxy-httpd" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.075142 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="sg-core" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.075176 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="ceilometer-notification-agent" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.075187 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" containerName="ceilometer-central-agent" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.078239 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.084383 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.084611 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.094154 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.147434 4817 scope.go:117] "RemoveContainer" containerID="89fd86e07eefacfa61a4a36819bd10a4cc33854bfac7dac7532c8204a1f8d04e" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.180445 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-run-httpd\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.180541 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-scripts\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.180563 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd75h\" (UniqueName: \"kubernetes.io/projected/768902a3-877f-4e72-bca3-7d04bfd49396-kube-api-access-gd75h\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.180578 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.180601 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-config-data\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.180645 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.180682 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-log-httpd\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.187037 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95079ccd-b9d1-4dc1-883a-5a6008410950" path="/var/lib/kubelet/pods/95079ccd-b9d1-4dc1-883a-5a6008410950/volumes" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.282356 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 
14:20:50.282429 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-log-httpd\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.282494 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-run-httpd\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.282581 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-scripts\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.282610 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd75h\" (UniqueName: \"kubernetes.io/projected/768902a3-877f-4e72-bca3-7d04bfd49396-kube-api-access-gd75h\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.282631 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.282657 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-config-data\") pod \"ceilometer-0\" (UID: 
\"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.283412 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-log-httpd\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.284383 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-run-httpd\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.288091 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.288127 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-config-data\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.289682 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-scripts\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.302343 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.310485 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd75h\" (UniqueName: \"kubernetes.io/projected/768902a3-877f-4e72-bca3-7d04bfd49396-kube-api-access-gd75h\") pod \"ceilometer-0\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") " pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.410682 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.944909 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:20:50 crc kubenswrapper[4817]: I0218 14:20:50.980496 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"768902a3-877f-4e72-bca3-7d04bfd49396","Type":"ContainerStarted","Data":"df0f3bcd37fb9489bbb2376952ce0a4aa7a72a3469ab6c601b175e1a09cd24c7"} Feb 18 14:20:51 crc kubenswrapper[4817]: I0218 14:20:51.158529 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5scpm" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" probeResult="failure" output=< Feb 18 14:20:51 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 14:20:51 crc kubenswrapper[4817]: > Feb 18 14:20:51 crc kubenswrapper[4817]: I0218 14:20:51.521231 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:20:52 crc kubenswrapper[4817]: I0218 14:20:52.005270 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"768902a3-877f-4e72-bca3-7d04bfd49396","Type":"ContainerStarted","Data":"07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5"} Feb 18 14:20:54 crc kubenswrapper[4817]: I0218 14:20:54.029098 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"768902a3-877f-4e72-bca3-7d04bfd49396","Type":"ContainerStarted","Data":"82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49"} Feb 18 14:20:55 crc kubenswrapper[4817]: I0218 14:20:55.040929 4817 generic.go:334] "Generic (PLEG): container finished" podID="64604bbb-190b-4850-97cc-07979a94d7aa" containerID="a70619c12ed1ff62eccf9a163deeae4ddb7b5cb7ff13b5191cb42343397ce290" exitCode=0 Feb 18 14:20:55 crc kubenswrapper[4817]: I0218 14:20:55.041051 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9gmj9" event={"ID":"64604bbb-190b-4850-97cc-07979a94d7aa","Type":"ContainerDied","Data":"a70619c12ed1ff62eccf9a163deeae4ddb7b5cb7ff13b5191cb42343397ce290"} Feb 18 14:20:55 crc kubenswrapper[4817]: I0218 14:20:55.043825 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"768902a3-877f-4e72-bca3-7d04bfd49396","Type":"ContainerStarted","Data":"388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8"} Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.465510 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.609824 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blsgn\" (UniqueName: \"kubernetes.io/projected/64604bbb-190b-4850-97cc-07979a94d7aa-kube-api-access-blsgn\") pod \"64604bbb-190b-4850-97cc-07979a94d7aa\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.610506 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-scripts\") pod \"64604bbb-190b-4850-97cc-07979a94d7aa\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.610643 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-config-data\") pod \"64604bbb-190b-4850-97cc-07979a94d7aa\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.610752 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-combined-ca-bundle\") pod \"64604bbb-190b-4850-97cc-07979a94d7aa\" (UID: \"64604bbb-190b-4850-97cc-07979a94d7aa\") " Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.615209 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-scripts" (OuterVolumeSpecName: "scripts") pod "64604bbb-190b-4850-97cc-07979a94d7aa" (UID: "64604bbb-190b-4850-97cc-07979a94d7aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.615600 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64604bbb-190b-4850-97cc-07979a94d7aa-kube-api-access-blsgn" (OuterVolumeSpecName: "kube-api-access-blsgn") pod "64604bbb-190b-4850-97cc-07979a94d7aa" (UID: "64604bbb-190b-4850-97cc-07979a94d7aa"). InnerVolumeSpecName "kube-api-access-blsgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.638727 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-config-data" (OuterVolumeSpecName: "config-data") pod "64604bbb-190b-4850-97cc-07979a94d7aa" (UID: "64604bbb-190b-4850-97cc-07979a94d7aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.655360 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64604bbb-190b-4850-97cc-07979a94d7aa" (UID: "64604bbb-190b-4850-97cc-07979a94d7aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.713672 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blsgn\" (UniqueName: \"kubernetes.io/projected/64604bbb-190b-4850-97cc-07979a94d7aa-kube-api-access-blsgn\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.714239 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.714357 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:56 crc kubenswrapper[4817]: I0218 14:20:56.714466 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64604bbb-190b-4850-97cc-07979a94d7aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.098570 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9gmj9" event={"ID":"64604bbb-190b-4850-97cc-07979a94d7aa","Type":"ContainerDied","Data":"ce8f73eb108da9dd58a23ffb705d73f0968359473237244b1c528e187d1931fe"} Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.098811 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce8f73eb108da9dd58a23ffb705d73f0968359473237244b1c528e187d1931fe" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.098942 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9gmj9" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.211484 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 18 14:20:57 crc kubenswrapper[4817]: E0218 14:20:57.212249 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64604bbb-190b-4850-97cc-07979a94d7aa" containerName="nova-cell0-conductor-db-sync" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.212266 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="64604bbb-190b-4850-97cc-07979a94d7aa" containerName="nova-cell0-conductor-db-sync" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.212483 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="64604bbb-190b-4850-97cc-07979a94d7aa" containerName="nova-cell0-conductor-db-sync" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.213256 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.216717 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kfm86" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.216939 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.228145 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.334347 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b99418c-8ded-4927-afdf-a9a6edbabf84-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b99418c-8ded-4927-afdf-a9a6edbabf84\") " pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 
14:20:57.334441 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4vkj\" (UniqueName: \"kubernetes.io/projected/4b99418c-8ded-4927-afdf-a9a6edbabf84-kube-api-access-k4vkj\") pod \"nova-cell0-conductor-0\" (UID: \"4b99418c-8ded-4927-afdf-a9a6edbabf84\") " pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.334496 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b99418c-8ded-4927-afdf-a9a6edbabf84-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b99418c-8ded-4927-afdf-a9a6edbabf84\") " pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.436365 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b99418c-8ded-4927-afdf-a9a6edbabf84-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b99418c-8ded-4927-afdf-a9a6edbabf84\") " pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.436448 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4vkj\" (UniqueName: \"kubernetes.io/projected/4b99418c-8ded-4927-afdf-a9a6edbabf84-kube-api-access-k4vkj\") pod \"nova-cell0-conductor-0\" (UID: \"4b99418c-8ded-4927-afdf-a9a6edbabf84\") " pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.436486 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b99418c-8ded-4927-afdf-a9a6edbabf84-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b99418c-8ded-4927-afdf-a9a6edbabf84\") " pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.445456 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b99418c-8ded-4927-afdf-a9a6edbabf84-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b99418c-8ded-4927-afdf-a9a6edbabf84\") " pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.460947 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b99418c-8ded-4927-afdf-a9a6edbabf84-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b99418c-8ded-4927-afdf-a9a6edbabf84\") " pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.469603 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4vkj\" (UniqueName: \"kubernetes.io/projected/4b99418c-8ded-4927-afdf-a9a6edbabf84-kube-api-access-k4vkj\") pod \"nova-cell0-conductor-0\" (UID: \"4b99418c-8ded-4927-afdf-a9a6edbabf84\") " pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:57 crc kubenswrapper[4817]: I0218 14:20:57.555582 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 18 14:20:58 crc kubenswrapper[4817]: I0218 14:20:58.077671 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 18 14:20:58 crc kubenswrapper[4817]: W0218 14:20:58.079226 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b99418c_8ded_4927_afdf_a9a6edbabf84.slice/crio-032fb24258bec140ff1c59b293ffa3b5f03008e17592ad59e7e02d03851d5ae4 WatchSource:0}: Error finding container 032fb24258bec140ff1c59b293ffa3b5f03008e17592ad59e7e02d03851d5ae4: Status 404 returned error can't find the container with id 032fb24258bec140ff1c59b293ffa3b5f03008e17592ad59e7e02d03851d5ae4 Feb 18 14:20:58 crc kubenswrapper[4817]: I0218 14:20:58.110171 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4b99418c-8ded-4927-afdf-a9a6edbabf84","Type":"ContainerStarted","Data":"032fb24258bec140ff1c59b293ffa3b5f03008e17592ad59e7e02d03851d5ae4"} Feb 18 14:20:58 crc kubenswrapper[4817]: I0218 14:20:58.113423 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"768902a3-877f-4e72-bca3-7d04bfd49396","Type":"ContainerStarted","Data":"7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a"} Feb 18 14:20:58 crc kubenswrapper[4817]: I0218 14:20:58.113646 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="ceilometer-central-agent" containerID="cri-o://07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5" gracePeriod=30 Feb 18 14:20:58 crc kubenswrapper[4817]: I0218 14:20:58.113668 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="proxy-httpd" 
containerID="cri-o://7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a" gracePeriod=30 Feb 18 14:20:58 crc kubenswrapper[4817]: I0218 14:20:58.113749 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="sg-core" containerID="cri-o://388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8" gracePeriod=30 Feb 18 14:20:58 crc kubenswrapper[4817]: I0218 14:20:58.113792 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="ceilometer-notification-agent" containerID="cri-o://82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49" gracePeriod=30 Feb 18 14:20:58 crc kubenswrapper[4817]: I0218 14:20:58.113999 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 18 14:20:58 crc kubenswrapper[4817]: I0218 14:20:58.150935 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.124080396 podStartE2EDuration="8.15091666s" podCreationTimestamp="2026-02-18 14:20:50 +0000 UTC" firstStartedPulling="2026-02-18 14:20:50.952886952 +0000 UTC m=+1313.528422935" lastFinishedPulling="2026-02-18 14:20:56.979723215 +0000 UTC m=+1319.555259199" observedRunningTime="2026-02-18 14:20:58.138347005 +0000 UTC m=+1320.713882998" watchObservedRunningTime="2026-02-18 14:20:58.15091666 +0000 UTC m=+1320.726452633" Feb 18 14:20:59 crc kubenswrapper[4817]: I0218 14:20:59.127131 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4b99418c-8ded-4927-afdf-a9a6edbabf84","Type":"ContainerStarted","Data":"5154026160ce594c2c47fb26f1b73fa42e2431d4c2823082deb5e635efecb2d4"} Feb 18 14:20:59 crc kubenswrapper[4817]: I0218 14:20:59.127707 4817 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 18 14:20:59 crc kubenswrapper[4817]: I0218 14:20:59.131425 4817 generic.go:334] "Generic (PLEG): container finished" podID="768902a3-877f-4e72-bca3-7d04bfd49396" containerID="7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a" exitCode=0
Feb 18 14:20:59 crc kubenswrapper[4817]: I0218 14:20:59.131471 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"768902a3-877f-4e72-bca3-7d04bfd49396","Type":"ContainerDied","Data":"7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a"}
Feb 18 14:20:59 crc kubenswrapper[4817]: I0218 14:20:59.131532 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"768902a3-877f-4e72-bca3-7d04bfd49396","Type":"ContainerDied","Data":"388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8"}
Feb 18 14:20:59 crc kubenswrapper[4817]: I0218 14:20:59.131493 4817 generic.go:334] "Generic (PLEG): container finished" podID="768902a3-877f-4e72-bca3-7d04bfd49396" containerID="388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8" exitCode=2
Feb 18 14:20:59 crc kubenswrapper[4817]: I0218 14:20:59.131571 4817 generic.go:334] "Generic (PLEG): container finished" podID="768902a3-877f-4e72-bca3-7d04bfd49396" containerID="82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49" exitCode=0
Feb 18 14:20:59 crc kubenswrapper[4817]: I0218 14:20:59.131595 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"768902a3-877f-4e72-bca3-7d04bfd49396","Type":"ContainerDied","Data":"82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49"}
Feb 18 14:20:59 crc kubenswrapper[4817]: I0218 14:20:59.150252 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.150233808 podStartE2EDuration="2.150233808s" podCreationTimestamp="2026-02-18 14:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:20:59.149325474 +0000 UTC m=+1321.724861467" watchObservedRunningTime="2026-02-18 14:20:59.150233808 +0000 UTC m=+1321.725769791"
Feb 18 14:21:01 crc kubenswrapper[4817]: I0218 14:21:01.149961 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5scpm" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" probeResult="failure" output=<
Feb 18 14:21:01 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Feb 18 14:21:01 crc kubenswrapper[4817]: >
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.186915 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.194275 4817 generic.go:334] "Generic (PLEG): container finished" podID="768902a3-877f-4e72-bca3-7d04bfd49396" containerID="07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5" exitCode=0
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.194348 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"768902a3-877f-4e72-bca3-7d04bfd49396","Type":"ContainerDied","Data":"07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5"}
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.194386 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"768902a3-877f-4e72-bca3-7d04bfd49396","Type":"ContainerDied","Data":"df0f3bcd37fb9489bbb2376952ce0a4aa7a72a3469ab6c601b175e1a09cd24c7"}
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.194391 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.194419 4817 scope.go:117] "RemoveContainer" containerID="7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.223193 4817 scope.go:117] "RemoveContainer" containerID="388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.257453 4817 scope.go:117] "RemoveContainer" containerID="82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.275277 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-sg-core-conf-yaml\") pod \"768902a3-877f-4e72-bca3-7d04bfd49396\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") "
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.275355 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-log-httpd\") pod \"768902a3-877f-4e72-bca3-7d04bfd49396\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") "
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.275395 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-combined-ca-bundle\") pod \"768902a3-877f-4e72-bca3-7d04bfd49396\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") "
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.275428 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-config-data\") pod \"768902a3-877f-4e72-bca3-7d04bfd49396\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") "
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.275486 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-scripts\") pod \"768902a3-877f-4e72-bca3-7d04bfd49396\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") "
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.275522 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd75h\" (UniqueName: \"kubernetes.io/projected/768902a3-877f-4e72-bca3-7d04bfd49396-kube-api-access-gd75h\") pod \"768902a3-877f-4e72-bca3-7d04bfd49396\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") "
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.275547 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-run-httpd\") pod \"768902a3-877f-4e72-bca3-7d04bfd49396\" (UID: \"768902a3-877f-4e72-bca3-7d04bfd49396\") "
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.277378 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "768902a3-877f-4e72-bca3-7d04bfd49396" (UID: "768902a3-877f-4e72-bca3-7d04bfd49396"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.277941 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "768902a3-877f-4e72-bca3-7d04bfd49396" (UID: "768902a3-877f-4e72-bca3-7d04bfd49396"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.282616 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768902a3-877f-4e72-bca3-7d04bfd49396-kube-api-access-gd75h" (OuterVolumeSpecName: "kube-api-access-gd75h") pod "768902a3-877f-4e72-bca3-7d04bfd49396" (UID: "768902a3-877f-4e72-bca3-7d04bfd49396"). InnerVolumeSpecName "kube-api-access-gd75h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.283162 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-scripts" (OuterVolumeSpecName: "scripts") pod "768902a3-877f-4e72-bca3-7d04bfd49396" (UID: "768902a3-877f-4e72-bca3-7d04bfd49396"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.287563 4817 scope.go:117] "RemoveContainer" containerID="07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.329267 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "768902a3-877f-4e72-bca3-7d04bfd49396" (UID: "768902a3-877f-4e72-bca3-7d04bfd49396"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.380828 4817 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.380857 4817 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.380871 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.380880 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd75h\" (UniqueName: \"kubernetes.io/projected/768902a3-877f-4e72-bca3-7d04bfd49396-kube-api-access-gd75h\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.380891 4817 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/768902a3-877f-4e72-bca3-7d04bfd49396-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.382697 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "768902a3-877f-4e72-bca3-7d04bfd49396" (UID: "768902a3-877f-4e72-bca3-7d04bfd49396"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.392861 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-config-data" (OuterVolumeSpecName: "config-data") pod "768902a3-877f-4e72-bca3-7d04bfd49396" (UID: "768902a3-877f-4e72-bca3-7d04bfd49396"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.401661 4817 scope.go:117] "RemoveContainer" containerID="7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a"
Feb 18 14:21:03 crc kubenswrapper[4817]: E0218 14:21:03.402172 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a\": container with ID starting with 7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a not found: ID does not exist" containerID="7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.402203 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a"} err="failed to get container status \"7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a\": rpc error: code = NotFound desc = could not find container \"7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a\": container with ID starting with 7fc9cad40ba25ae87ca41fb577fbb12dc04321eab07444af7d833c3811547b2a not found: ID does not exist"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.402224 4817 scope.go:117] "RemoveContainer" containerID="388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8"
Feb 18 14:21:03 crc kubenswrapper[4817]: E0218 14:21:03.402480 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8\": container with ID starting with 388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8 not found: ID does not exist" containerID="388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.402502 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8"} err="failed to get container status \"388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8\": rpc error: code = NotFound desc = could not find container \"388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8\": container with ID starting with 388c62a91c8817c4b6fc666ccb5bbf84f5741b6facf7a7ba27f35565487c94e8 not found: ID does not exist"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.402515 4817 scope.go:117] "RemoveContainer" containerID="82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49"
Feb 18 14:21:03 crc kubenswrapper[4817]: E0218 14:21:03.402865 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49\": container with ID starting with 82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49 not found: ID does not exist" containerID="82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.402884 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49"} err="failed to get container status \"82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49\": rpc error: code = NotFound desc = could not find container \"82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49\": container with ID starting with 82a5daf9b4058820b240159bca701d34d7bc4f37ef6df9d84150361900e52a49 not found: ID does not exist"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.402922 4817 scope.go:117] "RemoveContainer" containerID="07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5"
Feb 18 14:21:03 crc kubenswrapper[4817]: E0218 14:21:03.403257 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5\": container with ID starting with 07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5 not found: ID does not exist" containerID="07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.403277 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5"} err="failed to get container status \"07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5\": rpc error: code = NotFound desc = could not find container \"07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5\": container with ID starting with 07acca9164f268e0bc4da606278d502655291a0231a5471d6ddb18f14d1ea3c5 not found: ID does not exist"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.482388 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.482424 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768902a3-877f-4e72-bca3-7d04bfd49396-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.551389 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.565198 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.576901 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:21:03 crc kubenswrapper[4817]: E0218 14:21:03.577328 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="proxy-httpd"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.577345 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="proxy-httpd"
Feb 18 14:21:03 crc kubenswrapper[4817]: E0218 14:21:03.577362 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="sg-core"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.577368 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="sg-core"
Feb 18 14:21:03 crc kubenswrapper[4817]: E0218 14:21:03.577425 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="ceilometer-notification-agent"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.577432 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="ceilometer-notification-agent"
Feb 18 14:21:03 crc kubenswrapper[4817]: E0218 14:21:03.577462 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="ceilometer-central-agent"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.577468 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="ceilometer-central-agent"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.577646 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="ceilometer-central-agent"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.577674 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="proxy-httpd"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.577692 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="sg-core"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.577701 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" containerName="ceilometer-notification-agent"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.583738 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.584908 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.586327 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.586760 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.686075 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-scripts\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.686133 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgmld\" (UniqueName: \"kubernetes.io/projected/ed71e1c9-fa52-4a17-901c-5efc187043fb-kube-api-access-mgmld\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.686181 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-config-data\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.686206 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.686223 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-run-httpd\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.686648 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.686719 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-log-httpd\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.789016 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.789070 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-log-httpd\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.789158 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-scripts\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.789196 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgmld\" (UniqueName: \"kubernetes.io/projected/ed71e1c9-fa52-4a17-901c-5efc187043fb-kube-api-access-mgmld\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.789232 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-config-data\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.789288 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.789306 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-run-httpd\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.789780 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-log-httpd\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.789817 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-run-httpd\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.794078 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-scripts\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.794159 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-config-data\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.794540 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.794907 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.816956 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgmld\" (UniqueName: \"kubernetes.io/projected/ed71e1c9-fa52-4a17-901c-5efc187043fb-kube-api-access-mgmld\") pod \"ceilometer-0\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " pod="openstack/ceilometer-0"
Feb 18 14:21:03 crc kubenswrapper[4817]: I0218 14:21:03.908522 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:21:04 crc kubenswrapper[4817]: I0218 14:21:04.198380 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768902a3-877f-4e72-bca3-7d04bfd49396" path="/var/lib/kubelet/pods/768902a3-877f-4e72-bca3-7d04bfd49396/volumes"
Feb 18 14:21:04 crc kubenswrapper[4817]: I0218 14:21:04.675918 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:21:05 crc kubenswrapper[4817]: I0218 14:21:05.223556 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed71e1c9-fa52-4a17-901c-5efc187043fb","Type":"ContainerStarted","Data":"3371ff3ccddbfc64e59a5801f72c55462a8654d0f2e2f43a2878767a9012e200"}
Feb 18 14:21:06 crc kubenswrapper[4817]: I0218 14:21:06.235631 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed71e1c9-fa52-4a17-901c-5efc187043fb","Type":"ContainerStarted","Data":"9131995f4e638ba36056894268ca2779e276a03848ff6db8fc441c6ccae4f3f2"}
Feb 18 14:21:06 crc kubenswrapper[4817]: I0218 14:21:06.236168 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed71e1c9-fa52-4a17-901c-5efc187043fb","Type":"ContainerStarted","Data":"64bc7c486971d176034a65a33fd115616be7cfb1c7d68b9a6b896d9f077ac6b2"}
Feb 18 14:21:07 crc kubenswrapper[4817]: I0218 14:21:07.252728 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed71e1c9-fa52-4a17-901c-5efc187043fb","Type":"ContainerStarted","Data":"55e6a2f3b20af16f8244c617e69436a1c43036f167d97e14c5e8caed1384ce67"}
Feb 18 14:21:07 crc kubenswrapper[4817]: I0218 14:21:07.589444 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.118805 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mr9dl"]
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.120710 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.122520 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.122862 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.139344 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mr9dl"]
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.197937 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl66r\" (UniqueName: \"kubernetes.io/projected/76e9a054-69ef-45a0-b901-7ba80c2c2f46-kube-api-access-pl66r\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.198395 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.198549 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-config-data\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.198665 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-scripts\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.299923 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl66r\" (UniqueName: \"kubernetes.io/projected/76e9a054-69ef-45a0-b901-7ba80c2c2f46-kube-api-access-pl66r\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.300091 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.300158 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-config-data\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.300183 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-scripts\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.316595 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-config-data\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.336700 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-scripts\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.339581 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.396785 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl66r\" (UniqueName: \"kubernetes.io/projected/76e9a054-69ef-45a0-b901-7ba80c2c2f46-kube-api-access-pl66r\") pod \"nova-cell0-cell-mapping-mr9dl\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.448108 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mr9dl"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.459694 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.461828 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.467525 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.506790 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp64w\" (UniqueName: \"kubernetes.io/projected/4a6de9a3-608a-46a3-b157-5776f0590b63-kube-api-access-dp64w\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.506877 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.506941 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a6de9a3-608a-46a3-b157-5776f0590b63-logs\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.507007 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-config-data\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.531050 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.533149 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.542849 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.613235 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.619946 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-config-data\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.620337 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.620359 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.620456 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8trv\" (UniqueName: \"kubernetes.io/projected/55106d39-246d-49ea-ab48-d7b703b72eef-kube-api-access-h8trv\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0"
Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.620490 4817 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a6de9a3-608a-46a3-b157-5776f0590b63-logs\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.620591 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-config-data\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.620623 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55106d39-246d-49ea-ab48-d7b703b72eef-logs\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.620873 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp64w\" (UniqueName: \"kubernetes.io/projected/4a6de9a3-608a-46a3-b157-5776f0590b63-kube-api-access-dp64w\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.621452 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a6de9a3-608a-46a3-b157-5776f0590b63-logs\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.627672 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0" Feb 
18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.641677 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.649758 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-config-data\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.680414 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp64w\" (UniqueName: \"kubernetes.io/projected/4a6de9a3-608a-46a3-b157-5776f0590b63-kube-api-access-dp64w\") pod \"nova-metadata-0\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") " pod="openstack/nova-metadata-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.723023 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.724470 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-config-data\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.724520 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.724600 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8trv\" (UniqueName: 
\"kubernetes.io/projected/55106d39-246d-49ea-ab48-d7b703b72eef-kube-api-access-h8trv\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.724664 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55106d39-246d-49ea-ab48-d7b703b72eef-logs\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.725446 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55106d39-246d-49ea-ab48-d7b703b72eef-logs\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.734543 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.734603 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.736027 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.739609 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-config-data\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.745800 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.762334 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8trv\" (UniqueName: \"kubernetes.io/projected/55106d39-246d-49ea-ab48-d7b703b72eef-kube-api-access-h8trv\") pod \"nova-api-0\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " pod="openstack/nova-api-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.781772 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.829023 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.830189 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwzjp\" (UniqueName: \"kubernetes.io/projected/6d50c64d-0681-400c-8daf-d061888e2576-kube-api-access-nwzjp\") pod \"nova-scheduler-0\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.830251 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.830322 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-config-data\") pod \"nova-scheduler-0\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.855408 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.856737 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.864042 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.872614 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.905996 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d578b86f9-th88v"] Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.909268 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.924816 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d578b86f9-th88v"] Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.931953 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-sb\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932023 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932075 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-config\") pod 
\"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932128 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwzjp\" (UniqueName: \"kubernetes.io/projected/6d50c64d-0681-400c-8daf-d061888e2576-kube-api-access-nwzjp\") pod \"nova-scheduler-0\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932168 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932195 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932267 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-config-data\") pod \"nova-scheduler-0\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932284 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkf8b\" (UniqueName: \"kubernetes.io/projected/e6b021a8-f766-4296-87b8-88b45f99a5ba-kube-api-access-xkf8b\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: 
\"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932300 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zm57\" (UniqueName: \"kubernetes.io/projected/543ef68b-ffe2-4b3f-91b2-b34b458751f7-kube-api-access-7zm57\") pod \"nova-cell1-novncproxy-0\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932324 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-svc\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932341 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-swift-storage-0\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.932366 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-nb\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.937900 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-config-data\") pod \"nova-scheduler-0\" (UID: 
\"6d50c64d-0681-400c-8daf-d061888e2576\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.938665 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:08 crc kubenswrapper[4817]: I0218 14:21:08.954219 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwzjp\" (UniqueName: \"kubernetes.io/projected/6d50c64d-0681-400c-8daf-d061888e2576-kube-api-access-nwzjp\") pod \"nova-scheduler-0\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.034341 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-nb\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.034424 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-sb\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.034464 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:09 crc kubenswrapper[4817]: 
I0218 14:21:09.034493 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-config\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.034545 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.034603 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkf8b\" (UniqueName: \"kubernetes.io/projected/e6b021a8-f766-4296-87b8-88b45f99a5ba-kube-api-access-xkf8b\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.034619 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zm57\" (UniqueName: \"kubernetes.io/projected/543ef68b-ffe2-4b3f-91b2-b34b458751f7-kube-api-access-7zm57\") pod \"nova-cell1-novncproxy-0\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.034637 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-svc\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.034653 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-swift-storage-0\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.035521 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-swift-storage-0\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.036098 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-nb\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.036877 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-svc\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.038873 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-sb\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.039475 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-config\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.059915 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkf8b\" (UniqueName: \"kubernetes.io/projected/e6b021a8-f766-4296-87b8-88b45f99a5ba-kube-api-access-xkf8b\") pod \"dnsmasq-dns-5d578b86f9-th88v\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") " pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.060138 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.061483 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.078819 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zm57\" (UniqueName: \"kubernetes.io/projected/543ef68b-ffe2-4b3f-91b2-b34b458751f7-kube-api-access-7zm57\") pod \"nova-cell1-novncproxy-0\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.163788 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.209453 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.246873 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.317571 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mr9dl"] Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.603208 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.826196 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.962102 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hptxc"] Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.964041 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hptxc" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.968023 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.968333 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 18 14:21:09 crc kubenswrapper[4817]: I0218 14:21:09.986346 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.005595 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hptxc"] Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.088270 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvmgg\" (UniqueName: \"kubernetes.io/projected/e176b326-3d3d-4b95-8a7e-e18448de49ae-kube-api-access-tvmgg\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc" Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.088327 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-scripts\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc" Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.088352 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " 
pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.088441 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-config-data\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.111675 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 14:21:10 crc kubenswrapper[4817]: W0218 14:21:10.112270 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d50c64d_0681_400c_8daf_d061888e2576.slice/crio-d24222a1e8644534e4299a6b5e4783e5a0ff39ce16b019435b797f4832cd7722 WatchSource:0}: Error finding container d24222a1e8644534e4299a6b5e4783e5a0ff39ce16b019435b797f4832cd7722: Status 404 returned error can't find the container with id d24222a1e8644534e4299a6b5e4783e5a0ff39ce16b019435b797f4832cd7722
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.125479 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d578b86f9-th88v"]
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.190660 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvmgg\" (UniqueName: \"kubernetes.io/projected/e176b326-3d3d-4b95-8a7e-e18448de49ae-kube-api-access-tvmgg\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.190721 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-scripts\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.190745 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.190844 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-config-data\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.198354 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-scripts\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.199002 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-config-data\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.206509 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.208369 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvmgg\" (UniqueName: \"kubernetes.io/projected/e176b326-3d3d-4b95-8a7e-e18448de49ae-kube-api-access-tvmgg\") pod \"nova-cell1-conductor-db-sync-hptxc\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") " pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.303875 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.376159 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"543ef68b-ffe2-4b3f-91b2-b34b458751f7","Type":"ContainerStarted","Data":"2846ee1a927f6a6f8b981c4b569b8bec9932f4f7ef4e42562dd1d10eab89fda3"}
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.382970 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6d50c64d-0681-400c-8daf-d061888e2576","Type":"ContainerStarted","Data":"d24222a1e8644534e4299a6b5e4783e5a0ff39ce16b019435b797f4832cd7722"}
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.384783 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a6de9a3-608a-46a3-b157-5776f0590b63","Type":"ContainerStarted","Data":"199d423dbfcecd5d8250eb330f6ccca501f2b2101551d5b1fe1655305691bca9"}
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.386206 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55106d39-246d-49ea-ab48-d7b703b72eef","Type":"ContainerStarted","Data":"ce87577891535cb7c2c0d65950546740fe6cccedf060d44d1da0cb861f7f1beb"}
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.388154 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d578b86f9-th88v" event={"ID":"e6b021a8-f766-4296-87b8-88b45f99a5ba","Type":"ContainerStarted","Data":"2c5d33fd94566ebdc2a2cfb229a950124e4d811986e19a29d0d3d4df7cd422f0"}
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.389827 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mr9dl" event={"ID":"76e9a054-69ef-45a0-b901-7ba80c2c2f46","Type":"ContainerStarted","Data":"6068d26a7aff6d6af7977aa0851d1c0f08658d980729bbd3920d67f8b455ea51"}
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.389868 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mr9dl" event={"ID":"76e9a054-69ef-45a0-b901-7ba80c2c2f46","Type":"ContainerStarted","Data":"4665c1192e053ed90bfcac093678e9e38aa76cf54d6bfa77597c56c7890ddc33"}
Feb 18 14:21:10 crc kubenswrapper[4817]: I0218 14:21:10.418287 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mr9dl" podStartSLOduration=2.418266548 podStartE2EDuration="2.418266548s" podCreationTimestamp="2026-02-18 14:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:10.406852232 +0000 UTC m=+1332.982388215" watchObservedRunningTime="2026-02-18 14:21:10.418266548 +0000 UTC m=+1332.993802531"
Feb 18 14:21:11 crc kubenswrapper[4817]: I0218 14:21:11.016667 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hptxc"]
Feb 18 14:21:11 crc kubenswrapper[4817]: W0218 14:21:11.018440 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode176b326_3d3d_4b95_8a7e_e18448de49ae.slice/crio-344195bdba5eb0382780eb0c52a435e8ff3de1ff1649f7cac555d49251e20624 WatchSource:0}: Error finding container 344195bdba5eb0382780eb0c52a435e8ff3de1ff1649f7cac555d49251e20624: Status 404 returned error can't find the container with id 344195bdba5eb0382780eb0c52a435e8ff3de1ff1649f7cac555d49251e20624
Feb 18 14:21:11 crc kubenswrapper[4817]: I0218 14:21:11.164257 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5scpm" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" probeResult="failure" output=<
Feb 18 14:21:11 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Feb 18 14:21:11 crc kubenswrapper[4817]: >
Feb 18 14:21:11 crc kubenswrapper[4817]: I0218 14:21:11.438171 4817 generic.go:334] "Generic (PLEG): container finished" podID="e6b021a8-f766-4296-87b8-88b45f99a5ba" containerID="9f4e4287c85bd791f9c7fcd06f427262a323952dc37bc8614cbd4785daa1688a" exitCode=0
Feb 18 14:21:11 crc kubenswrapper[4817]: I0218 14:21:11.438232 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d578b86f9-th88v" event={"ID":"e6b021a8-f766-4296-87b8-88b45f99a5ba","Type":"ContainerDied","Data":"9f4e4287c85bd791f9c7fcd06f427262a323952dc37bc8614cbd4785daa1688a"}
Feb 18 14:21:11 crc kubenswrapper[4817]: I0218 14:21:11.503751 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hptxc" event={"ID":"e176b326-3d3d-4b95-8a7e-e18448de49ae","Type":"ContainerStarted","Data":"47a50248f30af6ef7ab21313783ae7fc54be9ceb9b4f064a1b9e653f3b841298"}
Feb 18 14:21:11 crc kubenswrapper[4817]: I0218 14:21:11.504141 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hptxc" event={"ID":"e176b326-3d3d-4b95-8a7e-e18448de49ae","Type":"ContainerStarted","Data":"344195bdba5eb0382780eb0c52a435e8ff3de1ff1649f7cac555d49251e20624"}
Feb 18 14:21:12 crc kubenswrapper[4817]: I0218 14:21:12.511434 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hptxc" podStartSLOduration=3.511412417 podStartE2EDuration="3.511412417s" podCreationTimestamp="2026-02-18 14:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:11.548320677 +0000 UTC m=+1334.123856660" watchObservedRunningTime="2026-02-18 14:21:12.511412417 +0000 UTC m=+1335.086948400"
Feb 18 14:21:12 crc kubenswrapper[4817]: I0218 14:21:12.527108 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 14:21:12 crc kubenswrapper[4817]: I0218 14:21:12.535858 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d578b86f9-th88v" event={"ID":"e6b021a8-f766-4296-87b8-88b45f99a5ba","Type":"ContainerStarted","Data":"e0d235e105fd3072304bce38d7b920551c82803f91c13815c70e98c0c9045e74"}
Feb 18 14:21:12 crc kubenswrapper[4817]: I0218 14:21:12.536024 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d578b86f9-th88v"
Feb 18 14:21:12 crc kubenswrapper[4817]: I0218 14:21:12.542758 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 18 14:21:12 crc kubenswrapper[4817]: I0218 14:21:12.543260 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed71e1c9-fa52-4a17-901c-5efc187043fb","Type":"ContainerStarted","Data":"1c6159cba34b7d2a5ffaad168d6f5bac0e2cd56bda4f68e75ae287def34f86ad"}
Feb 18 14:21:12 crc kubenswrapper[4817]: I0218 14:21:12.566542 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d578b86f9-th88v" podStartSLOduration=4.566522583 podStartE2EDuration="4.566522583s" podCreationTimestamp="2026-02-18 14:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:12.565363533 +0000 UTC m=+1335.140899526" watchObservedRunningTime="2026-02-18 14:21:12.566522583 +0000 UTC m=+1335.142058566"
Feb 18 14:21:12 crc kubenswrapper[4817]: I0218 14:21:12.601248 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.075022826 podStartE2EDuration="9.601224201s" podCreationTimestamp="2026-02-18 14:21:03 +0000 UTC" firstStartedPulling="2026-02-18 14:21:04.68000103 +0000 UTC m=+1327.255537013" lastFinishedPulling="2026-02-18 14:21:11.206202405 +0000 UTC m=+1333.781738388" observedRunningTime="2026-02-18 14:21:12.589505418 +0000 UTC m=+1335.165041411" watchObservedRunningTime="2026-02-18 14:21:12.601224201 +0000 UTC m=+1335.176760184"
Feb 18 14:21:13 crc kubenswrapper[4817]: I0218 14:21:13.555529 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 18 14:21:14 crc kubenswrapper[4817]: I0218 14:21:14.967923 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0"
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.595373 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a6de9a3-608a-46a3-b157-5776f0590b63","Type":"ContainerStarted","Data":"1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1"}
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.595473 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4a6de9a3-608a-46a3-b157-5776f0590b63" containerName="nova-metadata-log" containerID="cri-o://ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26" gracePeriod=30
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.595534 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4a6de9a3-608a-46a3-b157-5776f0590b63" containerName="nova-metadata-metadata" containerID="cri-o://1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1" gracePeriod=30
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.597179 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a6de9a3-608a-46a3-b157-5776f0590b63","Type":"ContainerStarted","Data":"ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26"}
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.598175 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55106d39-246d-49ea-ab48-d7b703b72eef","Type":"ContainerStarted","Data":"e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6"}
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.598234 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55106d39-246d-49ea-ab48-d7b703b72eef","Type":"ContainerStarted","Data":"6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac"}
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.604026 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"543ef68b-ffe2-4b3f-91b2-b34b458751f7","Type":"ContainerStarted","Data":"9c2bf5bd31790eaed02449111a9156317ae8e76208ab4773d58e99c8fb04ab3d"}
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.604074 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="543ef68b-ffe2-4b3f-91b2-b34b458751f7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9c2bf5bd31790eaed02449111a9156317ae8e76208ab4773d58e99c8fb04ab3d" gracePeriod=30
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.607126 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6d50c64d-0681-400c-8daf-d061888e2576","Type":"ContainerStarted","Data":"0bc6c5603aab187ef95120f9c4d06035f9c7fdc4758d45260e310ac6aa1f6c55"}
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.630363 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.37254428 podStartE2EDuration="8.630337634s" podCreationTimestamp="2026-02-18 14:21:08 +0000 UTC" firstStartedPulling="2026-02-18 14:21:09.842421398 +0000 UTC m=+1332.417957381" lastFinishedPulling="2026-02-18 14:21:15.100214752 +0000 UTC m=+1337.675750735" observedRunningTime="2026-02-18 14:21:16.624960065 +0000 UTC m=+1339.200496048" watchObservedRunningTime="2026-02-18 14:21:16.630337634 +0000 UTC m=+1339.205873617"
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.656673 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.689660176 podStartE2EDuration="8.656653995s" podCreationTimestamp="2026-02-18 14:21:08 +0000 UTC" firstStartedPulling="2026-02-18 14:21:10.134168647 +0000 UTC m=+1332.709704640" lastFinishedPulling="2026-02-18 14:21:15.101162476 +0000 UTC m=+1337.676698459" observedRunningTime="2026-02-18 14:21:16.639128712 +0000 UTC m=+1339.214664705" watchObservedRunningTime="2026-02-18 14:21:16.656653995 +0000 UTC m=+1339.232189968"
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.682229 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.200800506 podStartE2EDuration="8.682207856s" podCreationTimestamp="2026-02-18 14:21:08 +0000 UTC" firstStartedPulling="2026-02-18 14:21:09.619561321 +0000 UTC m=+1332.195097304" lastFinishedPulling="2026-02-18 14:21:15.100968671 +0000 UTC m=+1337.676504654" observedRunningTime="2026-02-18 14:21:16.667424924 +0000 UTC m=+1339.242960927" watchObservedRunningTime="2026-02-18 14:21:16.682207856 +0000 UTC m=+1339.257743839"
Feb 18 14:21:16 crc kubenswrapper[4817]: I0218 14:21:16.692958 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.569566047 podStartE2EDuration="8.692936524s" podCreationTimestamp="2026-02-18 14:21:08 +0000 UTC" firstStartedPulling="2026-02-18 14:21:09.993718982 +0000 UTC m=+1332.569254965" lastFinishedPulling="2026-02-18 14:21:15.117089459 +0000 UTC m=+1337.692625442" observedRunningTime="2026-02-18 14:21:16.691303972 +0000 UTC m=+1339.266839955" watchObservedRunningTime="2026-02-18 14:21:16.692936524 +0000 UTC m=+1339.268472507"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.293279 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.383395 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp64w\" (UniqueName: \"kubernetes.io/projected/4a6de9a3-608a-46a3-b157-5776f0590b63-kube-api-access-dp64w\") pod \"4a6de9a3-608a-46a3-b157-5776f0590b63\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") "
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.383454 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-config-data\") pod \"4a6de9a3-608a-46a3-b157-5776f0590b63\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") "
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.383513 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a6de9a3-608a-46a3-b157-5776f0590b63-logs\") pod \"4a6de9a3-608a-46a3-b157-5776f0590b63\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") "
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.383699 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-combined-ca-bundle\") pod \"4a6de9a3-608a-46a3-b157-5776f0590b63\" (UID: \"4a6de9a3-608a-46a3-b157-5776f0590b63\") "
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.384059 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6de9a3-608a-46a3-b157-5776f0590b63-logs" (OuterVolumeSpecName: "logs") pod "4a6de9a3-608a-46a3-b157-5776f0590b63" (UID: "4a6de9a3-608a-46a3-b157-5776f0590b63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.384520 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a6de9a3-608a-46a3-b157-5776f0590b63-logs\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.401365 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6de9a3-608a-46a3-b157-5776f0590b63-kube-api-access-dp64w" (OuterVolumeSpecName: "kube-api-access-dp64w") pod "4a6de9a3-608a-46a3-b157-5776f0590b63" (UID: "4a6de9a3-608a-46a3-b157-5776f0590b63"). InnerVolumeSpecName "kube-api-access-dp64w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.418089 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-config-data" (OuterVolumeSpecName: "config-data") pod "4a6de9a3-608a-46a3-b157-5776f0590b63" (UID: "4a6de9a3-608a-46a3-b157-5776f0590b63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.428257 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a6de9a3-608a-46a3-b157-5776f0590b63" (UID: "4a6de9a3-608a-46a3-b157-5776f0590b63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.486821 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.486871 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp64w\" (UniqueName: \"kubernetes.io/projected/4a6de9a3-608a-46a3-b157-5776f0590b63-kube-api-access-dp64w\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.486887 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a6de9a3-608a-46a3-b157-5776f0590b63-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.622080 4817 generic.go:334] "Generic (PLEG): container finished" podID="4a6de9a3-608a-46a3-b157-5776f0590b63" containerID="1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1" exitCode=0
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.622116 4817 generic.go:334] "Generic (PLEG): container finished" podID="4a6de9a3-608a-46a3-b157-5776f0590b63" containerID="ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26" exitCode=143
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.622464 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a6de9a3-608a-46a3-b157-5776f0590b63","Type":"ContainerDied","Data":"1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1"}
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.622521 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a6de9a3-608a-46a3-b157-5776f0590b63","Type":"ContainerDied","Data":"ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26"}
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.622535 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a6de9a3-608a-46a3-b157-5776f0590b63","Type":"ContainerDied","Data":"199d423dbfcecd5d8250eb330f6ccca501f2b2101551d5b1fe1655305691bca9"}
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.622555 4817 scope.go:117] "RemoveContainer" containerID="1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.622707 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.648356 4817 scope.go:117] "RemoveContainer" containerID="ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.688116 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.710436 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.715319 4817 scope.go:117] "RemoveContainer" containerID="1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.723735 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 14:21:17 crc kubenswrapper[4817]: E0218 14:21:17.724222 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6de9a3-608a-46a3-b157-5776f0590b63" containerName="nova-metadata-metadata"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.724237 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6de9a3-608a-46a3-b157-5776f0590b63" containerName="nova-metadata-metadata"
Feb 18 14:21:17 crc kubenswrapper[4817]: E0218 14:21:17.724296 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6de9a3-608a-46a3-b157-5776f0590b63" containerName="nova-metadata-log"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.724304 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6de9a3-608a-46a3-b157-5776f0590b63" containerName="nova-metadata-log"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.724528 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6de9a3-608a-46a3-b157-5776f0590b63" containerName="nova-metadata-log"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.724559 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6de9a3-608a-46a3-b157-5776f0590b63" containerName="nova-metadata-metadata"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.725966 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: E0218 14:21:17.731321 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1\": container with ID starting with 1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1 not found: ID does not exist" containerID="1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.731358 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1"} err="failed to get container status \"1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1\": rpc error: code = NotFound desc = could not find container \"1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1\": container with ID starting with 1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1 not found: ID does not exist"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.731386 4817 scope.go:117] "RemoveContainer" containerID="ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26"
Feb 18 14:21:17 crc kubenswrapper[4817]: E0218 14:21:17.732214 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26\": container with ID starting with ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26 not found: ID does not exist" containerID="ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.732243 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26"} err="failed to get container status \"ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26\": rpc error: code = NotFound desc = could not find container \"ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26\": container with ID starting with ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26 not found: ID does not exist"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.732266 4817 scope.go:117] "RemoveContainer" containerID="1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.732508 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.734656 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.734869 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.737713 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1"} err="failed to get container status \"1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1\": rpc error: code = NotFound desc = could not find container \"1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1\": container with ID starting with 1054a3b36e63d766c6a6f04df0c2a83127ab59e396223456f704f95987e223d1 not found: ID does not exist"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.737747 4817 scope.go:117] "RemoveContainer" containerID="ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.741896 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26"} err="failed to get container status \"ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26\": rpc error: code = NotFound desc = could not find container \"ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26\": container with ID starting with ab2dd39383e0c35d657da151de272bb50dee34228360a8ff82e882d962561d26 not found: ID does not exist"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.894119 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.894249 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkw8n\" (UniqueName: \"kubernetes.io/projected/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-kube-api-access-qkw8n\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.894320 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-config-data\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.894350 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-logs\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.894421 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.997343 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.997577 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.997743 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkw8n\" (UniqueName: \"kubernetes.io/projected/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-kube-api-access-qkw8n\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.997874 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-config-data\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.997919 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-logs\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:17 crc kubenswrapper[4817]: I0218 14:21:17.998520 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-logs\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:18 crc kubenswrapper[4817]: I0218 14:21:18.002681 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:18 crc kubenswrapper[4817]: I0218 14:21:18.003150 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:18 crc kubenswrapper[4817]: I0218 14:21:18.005832 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-config-data\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:18 crc kubenswrapper[4817]: I0218 14:21:18.022842 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkw8n\" (UniqueName: \"kubernetes.io/projected/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-kube-api-access-qkw8n\") pod \"nova-metadata-0\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " pod="openstack/nova-metadata-0"
Feb 18 14:21:18 crc kubenswrapper[4817]: I0218 14:21:18.056190 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 18 14:21:18 crc kubenswrapper[4817]: I0218 14:21:18.187012 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a6de9a3-608a-46a3-b157-5776f0590b63" path="/var/lib/kubelet/pods/4a6de9a3-608a-46a3-b157-5776f0590b63/volumes"
Feb 18 14:21:18 crc kubenswrapper[4817]: I0218 14:21:18.521512 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 18 14:21:18 crc kubenswrapper[4817]: W0218 14:21:18.528540 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod804e8c84_e2fb_4a69_8d21_c5d8bb291be2.slice/crio-a1ef9c47643d458c64fb1bbb699349cc248c6f7a8e81f9e3db77ab79ffef7605 WatchSource:0}: Error finding container a1ef9c47643d458c64fb1bbb699349cc248c6f7a8e81f9e3db77ab79ffef7605: Status 404 returned error can't find the container with id a1ef9c47643d458c64fb1bbb699349cc248c6f7a8e81f9e3db77ab79ffef7605
Feb 18 14:21:18 crc kubenswrapper[4817]: I0218 14:21:18.637484 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"804e8c84-e2fb-4a69-8d21-c5d8bb291be2","Type":"ContainerStarted","Data":"a1ef9c47643d458c64fb1bbb699349cc248c6f7a8e81f9e3db77ab79ffef7605"}
Feb 18 14:21:18 crc kubenswrapper[4817]: I0218 14:21:18.782715 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 18 14:21:18 crc kubenswrapper[4817]: I0218 14:21:18.782791 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.165394 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.165861 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.211428 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.224464 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.248153 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d578b86f9-th88v"
Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.333448 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76d4d7c9b7-hsgjd"]
Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.338627 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" podUID="fe95924e-a4f6-4844-b496-461e91941a16" containerName="dnsmasq-dns" containerID="cri-o://b16cfcffd52bda8befb4ab613338c503c09798ef2a97175cdc361648d68ddf89" gracePeriod=10
Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.361849 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" podUID="fe95924e-a4f6-4844-b496-461e91941a16" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.204:5353: connect: connection refused"
Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.652416 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"804e8c84-e2fb-4a69-8d21-c5d8bb291be2","Type":"ContainerStarted","Data":"80f7efcbe5f849c447341e3008d7c195009c7c63dfe561363936ec03f4637064"}
Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.652477 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"804e8c84-e2fb-4a69-8d21-c5d8bb291be2","Type":"ContainerStarted","Data":"94d2be14a263212f3973117d7e35008b4526d374e1514a4b6f27a9f85b6c9a95"}
Feb 18 14:21:19 crc kubenswrapper[4817]:
I0218 14:21:19.702744 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.866236 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 14:21:19 crc kubenswrapper[4817]: I0218 14:21:19.866304 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 14:21:20 crc kubenswrapper[4817]: I0218 14:21:20.664037 4817 generic.go:334] "Generic (PLEG): container finished" podID="fe95924e-a4f6-4844-b496-461e91941a16" containerID="b16cfcffd52bda8befb4ab613338c503c09798ef2a97175cdc361648d68ddf89" exitCode=0 Feb 18 14:21:20 crc kubenswrapper[4817]: I0218 14:21:20.664128 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" event={"ID":"fe95924e-a4f6-4844-b496-461e91941a16","Type":"ContainerDied","Data":"b16cfcffd52bda8befb4ab613338c503c09798ef2a97175cdc361648d68ddf89"} Feb 18 14:21:20 crc kubenswrapper[4817]: I0218 14:21:20.695713 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.695687185 podStartE2EDuration="3.695687185s" podCreationTimestamp="2026-02-18 14:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:20.685181993 +0000 UTC m=+1343.260717996" watchObservedRunningTime="2026-02-18 14:21:20.695687185 +0000 UTC m=+1343.271223168" Feb 18 14:21:21 crc 
kubenswrapper[4817]: I0218 14:21:21.132145 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5scpm" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" probeResult="failure" output=< Feb 18 14:21:21 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 14:21:21 crc kubenswrapper[4817]: > Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.200889 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.290076 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-swift-storage-0\") pod \"fe95924e-a4f6-4844-b496-461e91941a16\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.290513 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-config\") pod \"fe95924e-a4f6-4844-b496-461e91941a16\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.290664 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-nb\") pod \"fe95924e-a4f6-4844-b496-461e91941a16\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.290894 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-svc\") pod \"fe95924e-a4f6-4844-b496-461e91941a16\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " Feb 18 14:21:21 
crc kubenswrapper[4817]: I0218 14:21:21.291108 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-sb\") pod \"fe95924e-a4f6-4844-b496-461e91941a16\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.291525 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6vnl\" (UniqueName: \"kubernetes.io/projected/fe95924e-a4f6-4844-b496-461e91941a16-kube-api-access-q6vnl\") pod \"fe95924e-a4f6-4844-b496-461e91941a16\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.309209 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe95924e-a4f6-4844-b496-461e91941a16-kube-api-access-q6vnl" (OuterVolumeSpecName: "kube-api-access-q6vnl") pod "fe95924e-a4f6-4844-b496-461e91941a16" (UID: "fe95924e-a4f6-4844-b496-461e91941a16"). InnerVolumeSpecName "kube-api-access-q6vnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.348601 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe95924e-a4f6-4844-b496-461e91941a16" (UID: "fe95924e-a4f6-4844-b496-461e91941a16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.353592 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe95924e-a4f6-4844-b496-461e91941a16" (UID: "fe95924e-a4f6-4844-b496-461e91941a16"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.358542 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fe95924e-a4f6-4844-b496-461e91941a16" (UID: "fe95924e-a4f6-4844-b496-461e91941a16"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.359727 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe95924e-a4f6-4844-b496-461e91941a16" (UID: "fe95924e-a4f6-4844-b496-461e91941a16"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.394296 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-config" (OuterVolumeSpecName: "config") pod "fe95924e-a4f6-4844-b496-461e91941a16" (UID: "fe95924e-a4f6-4844-b496-461e91941a16"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.394401 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-config\") pod \"fe95924e-a4f6-4844-b496-461e91941a16\" (UID: \"fe95924e-a4f6-4844-b496-461e91941a16\") " Feb 18 14:21:21 crc kubenswrapper[4817]: W0218 14:21:21.394514 4817 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/fe95924e-a4f6-4844-b496-461e91941a16/volumes/kubernetes.io~configmap/config Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.394539 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-config" (OuterVolumeSpecName: "config") pod "fe95924e-a4f6-4844-b496-461e91941a16" (UID: "fe95924e-a4f6-4844-b496-461e91941a16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.394936 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.394959 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.394970 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.394996 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.395005 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe95924e-a4f6-4844-b496-461e91941a16-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.395016 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6vnl\" (UniqueName: \"kubernetes.io/projected/fe95924e-a4f6-4844-b496-461e91941a16-kube-api-access-q6vnl\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.685519 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" event={"ID":"fe95924e-a4f6-4844-b496-461e91941a16","Type":"ContainerDied","Data":"7166b0c6410788e979b3ae08d0f032c6efe6b0fbb4ace60a557184fdaf0b89ef"} Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.685878 4817 scope.go:117] "RemoveContainer" containerID="b16cfcffd52bda8befb4ab613338c503c09798ef2a97175cdc361648d68ddf89" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.685629 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76d4d7c9b7-hsgjd" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.692303 4817 generic.go:334] "Generic (PLEG): container finished" podID="76e9a054-69ef-45a0-b901-7ba80c2c2f46" containerID="6068d26a7aff6d6af7977aa0851d1c0f08658d980729bbd3920d67f8b455ea51" exitCode=0 Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.692367 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mr9dl" event={"ID":"76e9a054-69ef-45a0-b901-7ba80c2c2f46","Type":"ContainerDied","Data":"6068d26a7aff6d6af7977aa0851d1c0f08658d980729bbd3920d67f8b455ea51"} Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.730383 4817 scope.go:117] "RemoveContainer" containerID="941c80450ab2e862b065261379136eebc674624ce6ea44cb9888734acd0e551e" Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.737264 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76d4d7c9b7-hsgjd"] Feb 18 14:21:21 crc kubenswrapper[4817]: I0218 14:21:21.748401 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76d4d7c9b7-hsgjd"] Feb 18 14:21:22 crc kubenswrapper[4817]: I0218 14:21:22.187393 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe95924e-a4f6-4844-b496-461e91941a16" path="/var/lib/kubelet/pods/fe95924e-a4f6-4844-b496-461e91941a16/volumes" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.056482 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.057003 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.191349 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mr9dl" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.234776 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-combined-ca-bundle\") pod \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.234822 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl66r\" (UniqueName: \"kubernetes.io/projected/76e9a054-69ef-45a0-b901-7ba80c2c2f46-kube-api-access-pl66r\") pod \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.234923 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-config-data\") pod \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.235132 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-scripts\") pod \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\" (UID: \"76e9a054-69ef-45a0-b901-7ba80c2c2f46\") " Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.243624 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-scripts" (OuterVolumeSpecName: "scripts") pod "76e9a054-69ef-45a0-b901-7ba80c2c2f46" (UID: "76e9a054-69ef-45a0-b901-7ba80c2c2f46"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.245769 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e9a054-69ef-45a0-b901-7ba80c2c2f46-kube-api-access-pl66r" (OuterVolumeSpecName: "kube-api-access-pl66r") pod "76e9a054-69ef-45a0-b901-7ba80c2c2f46" (UID: "76e9a054-69ef-45a0-b901-7ba80c2c2f46"). InnerVolumeSpecName "kube-api-access-pl66r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.265667 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-config-data" (OuterVolumeSpecName: "config-data") pod "76e9a054-69ef-45a0-b901-7ba80c2c2f46" (UID: "76e9a054-69ef-45a0-b901-7ba80c2c2f46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.266201 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76e9a054-69ef-45a0-b901-7ba80c2c2f46" (UID: "76e9a054-69ef-45a0-b901-7ba80c2c2f46"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.337705 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.337737 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.337765 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl66r\" (UniqueName: \"kubernetes.io/projected/76e9a054-69ef-45a0-b901-7ba80c2c2f46-kube-api-access-pl66r\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.337777 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e9a054-69ef-45a0-b901-7ba80c2c2f46-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.716911 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mr9dl" event={"ID":"76e9a054-69ef-45a0-b901-7ba80c2c2f46","Type":"ContainerDied","Data":"4665c1192e053ed90bfcac093678e9e38aa76cf54d6bfa77597c56c7890ddc33"} Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.716957 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4665c1192e053ed90bfcac093678e9e38aa76cf54d6bfa77597c56c7890ddc33" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.716985 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mr9dl" Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.995572 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:21:23 crc kubenswrapper[4817]: I0218 14:21:23.995826 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6d50c64d-0681-400c-8daf-d061888e2576" containerName="nova-scheduler-scheduler" containerID="cri-o://0bc6c5603aab187ef95120f9c4d06035f9c7fdc4758d45260e310ac6aa1f6c55" gracePeriod=30 Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.005374 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.005586 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" containerName="nova-api-log" containerID="cri-o://6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac" gracePeriod=30 Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.005671 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" containerName="nova-api-api" containerID="cri-o://e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6" gracePeriod=30 Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.029256 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.029478 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" containerName="nova-metadata-log" containerID="cri-o://94d2be14a263212f3973117d7e35008b4526d374e1514a4b6f27a9f85b6c9a95" gracePeriod=30 Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.029539 4817 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" containerName="nova-metadata-metadata" containerID="cri-o://80f7efcbe5f849c447341e3008d7c195009c7c63dfe561363936ec03f4637064" gracePeriod=30 Feb 18 14:21:24 crc kubenswrapper[4817]: E0218 14:21:24.167640 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0bc6c5603aab187ef95120f9c4d06035f9c7fdc4758d45260e310ac6aa1f6c55" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 18 14:21:24 crc kubenswrapper[4817]: E0218 14:21:24.169592 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0bc6c5603aab187ef95120f9c4d06035f9c7fdc4758d45260e310ac6aa1f6c55" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 18 14:21:24 crc kubenswrapper[4817]: E0218 14:21:24.171394 4817 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0bc6c5603aab187ef95120f9c4d06035f9c7fdc4758d45260e310ac6aa1f6c55" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 18 14:21:24 crc kubenswrapper[4817]: E0218 14:21:24.171475 4817 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6d50c64d-0681-400c-8daf-d061888e2576" containerName="nova-scheduler-scheduler" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.730768 4817 generic.go:334] "Generic (PLEG): container finished" 
podID="55106d39-246d-49ea-ab48-d7b703b72eef" containerID="6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac" exitCode=143 Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.730847 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55106d39-246d-49ea-ab48-d7b703b72eef","Type":"ContainerDied","Data":"6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac"} Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.733246 4817 generic.go:334] "Generic (PLEG): container finished" podID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" containerID="80f7efcbe5f849c447341e3008d7c195009c7c63dfe561363936ec03f4637064" exitCode=0 Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.733270 4817 generic.go:334] "Generic (PLEG): container finished" podID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" containerID="94d2be14a263212f3973117d7e35008b4526d374e1514a4b6f27a9f85b6c9a95" exitCode=143 Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.733288 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"804e8c84-e2fb-4a69-8d21-c5d8bb291be2","Type":"ContainerDied","Data":"80f7efcbe5f849c447341e3008d7c195009c7c63dfe561363936ec03f4637064"} Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.733311 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"804e8c84-e2fb-4a69-8d21-c5d8bb291be2","Type":"ContainerDied","Data":"94d2be14a263212f3973117d7e35008b4526d374e1514a4b6f27a9f85b6c9a95"} Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.733321 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"804e8c84-e2fb-4a69-8d21-c5d8bb291be2","Type":"ContainerDied","Data":"a1ef9c47643d458c64fb1bbb699349cc248c6f7a8e81f9e3db77ab79ffef7605"} Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.733329 4817 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a1ef9c47643d458c64fb1bbb699349cc248c6f7a8e81f9e3db77ab79ffef7605" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.817423 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.870999 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-config-data\") pod \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.871129 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-combined-ca-bundle\") pod \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.871235 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkw8n\" (UniqueName: \"kubernetes.io/projected/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-kube-api-access-qkw8n\") pod \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.871277 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-logs\") pod \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\" (UID: \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.871299 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-nova-metadata-tls-certs\") pod \"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\" (UID: 
\"804e8c84-e2fb-4a69-8d21-c5d8bb291be2\") " Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.877283 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-logs" (OuterVolumeSpecName: "logs") pod "804e8c84-e2fb-4a69-8d21-c5d8bb291be2" (UID: "804e8c84-e2fb-4a69-8d21-c5d8bb291be2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.879570 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-kube-api-access-qkw8n" (OuterVolumeSpecName: "kube-api-access-qkw8n") pod "804e8c84-e2fb-4a69-8d21-c5d8bb291be2" (UID: "804e8c84-e2fb-4a69-8d21-c5d8bb291be2"). InnerVolumeSpecName "kube-api-access-qkw8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.907320 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-config-data" (OuterVolumeSpecName: "config-data") pod "804e8c84-e2fb-4a69-8d21-c5d8bb291be2" (UID: "804e8c84-e2fb-4a69-8d21-c5d8bb291be2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.907858 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "804e8c84-e2fb-4a69-8d21-c5d8bb291be2" (UID: "804e8c84-e2fb-4a69-8d21-c5d8bb291be2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.926853 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "804e8c84-e2fb-4a69-8d21-c5d8bb291be2" (UID: "804e8c84-e2fb-4a69-8d21-c5d8bb291be2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.974152 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkw8n\" (UniqueName: \"kubernetes.io/projected/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-kube-api-access-qkw8n\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.974185 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.974195 4817 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.974206 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:24 crc kubenswrapper[4817]: I0218 14:21:24.974216 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/804e8c84-e2fb-4a69-8d21-c5d8bb291be2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.742449 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.776499 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.786908 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.802414 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:21:25 crc kubenswrapper[4817]: E0218 14:21:25.802947 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" containerName="nova-metadata-metadata" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.802970 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" containerName="nova-metadata-metadata" Feb 18 14:21:25 crc kubenswrapper[4817]: E0218 14:21:25.803040 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e9a054-69ef-45a0-b901-7ba80c2c2f46" containerName="nova-manage" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.803050 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e9a054-69ef-45a0-b901-7ba80c2c2f46" containerName="nova-manage" Feb 18 14:21:25 crc kubenswrapper[4817]: E0218 14:21:25.803087 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" containerName="nova-metadata-log" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.803094 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" containerName="nova-metadata-log" Feb 18 14:21:25 crc kubenswrapper[4817]: E0218 14:21:25.803117 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe95924e-a4f6-4844-b496-461e91941a16" containerName="dnsmasq-dns" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.803124 4817 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fe95924e-a4f6-4844-b496-461e91941a16" containerName="dnsmasq-dns" Feb 18 14:21:25 crc kubenswrapper[4817]: E0218 14:21:25.803138 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe95924e-a4f6-4844-b496-461e91941a16" containerName="init" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.803145 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe95924e-a4f6-4844-b496-461e91941a16" containerName="init" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.803390 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" containerName="nova-metadata-metadata" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.803420 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e9a054-69ef-45a0-b901-7ba80c2c2f46" containerName="nova-manage" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.803446 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe95924e-a4f6-4844-b496-461e91941a16" containerName="dnsmasq-dns" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.803459 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" containerName="nova-metadata-log" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.804692 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.807417 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.807892 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.811770 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.894418 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-config-data\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.894477 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-logs\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.894648 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zt9z\" (UniqueName: \"kubernetes.io/projected/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-kube-api-access-6zt9z\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.894876 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-nova-metadata-tls-certs\") pod \"nova-metadata-0\" 
(UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.895040 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.998701 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-config-data\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.998756 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-logs\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.998808 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zt9z\" (UniqueName: \"kubernetes.io/projected/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-kube-api-access-6zt9z\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.999239 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-logs\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.999319 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:25 crc kubenswrapper[4817]: I0218 14:21:25.999649 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.002580 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.004942 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.008217 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-config-data\") pod \"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.022487 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zt9z\" (UniqueName: \"kubernetes.io/projected/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-kube-api-access-6zt9z\") pod 
\"nova-metadata-0\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " pod="openstack/nova-metadata-0" Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.127081 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.185011 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804e8c84-e2fb-4a69-8d21-c5d8bb291be2" path="/var/lib/kubelet/pods/804e8c84-e2fb-4a69-8d21-c5d8bb291be2/volumes" Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.658199 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:21:26 crc kubenswrapper[4817]: W0218 14:21:26.666372 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6e8372d_7ebc_4f22_8d3d_6d653c128b06.slice/crio-01e3ff83d904d0da3040a1616e0b9d58c233b5dfa71d6bc6a348585e5b250c13 WatchSource:0}: Error finding container 01e3ff83d904d0da3040a1616e0b9d58c233b5dfa71d6bc6a348585e5b250c13: Status 404 returned error can't find the container with id 01e3ff83d904d0da3040a1616e0b9d58c233b5dfa71d6bc6a348585e5b250c13 Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.764008 4817 generic.go:334] "Generic (PLEG): container finished" podID="6d50c64d-0681-400c-8daf-d061888e2576" containerID="0bc6c5603aab187ef95120f9c4d06035f9c7fdc4758d45260e310ac6aa1f6c55" exitCode=0 Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.764131 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6d50c64d-0681-400c-8daf-d061888e2576","Type":"ContainerDied","Data":"0bc6c5603aab187ef95120f9c4d06035f9c7fdc4758d45260e310ac6aa1f6c55"} Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.769166 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d6e8372d-7ebc-4f22-8d3d-6d653c128b06","Type":"ContainerStarted","Data":"01e3ff83d904d0da3040a1616e0b9d58c233b5dfa71d6bc6a348585e5b250c13"} Feb 18 14:21:26 crc kubenswrapper[4817]: I0218 14:21:26.903070 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.029105 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-config-data\") pod \"6d50c64d-0681-400c-8daf-d061888e2576\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.029453 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-combined-ca-bundle\") pod \"6d50c64d-0681-400c-8daf-d061888e2576\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.029520 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwzjp\" (UniqueName: \"kubernetes.io/projected/6d50c64d-0681-400c-8daf-d061888e2576-kube-api-access-nwzjp\") pod \"6d50c64d-0681-400c-8daf-d061888e2576\" (UID: \"6d50c64d-0681-400c-8daf-d061888e2576\") " Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.032925 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d50c64d-0681-400c-8daf-d061888e2576-kube-api-access-nwzjp" (OuterVolumeSpecName: "kube-api-access-nwzjp") pod "6d50c64d-0681-400c-8daf-d061888e2576" (UID: "6d50c64d-0681-400c-8daf-d061888e2576"). InnerVolumeSpecName "kube-api-access-nwzjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.066898 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d50c64d-0681-400c-8daf-d061888e2576" (UID: "6d50c64d-0681-400c-8daf-d061888e2576"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.069160 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-config-data" (OuterVolumeSpecName: "config-data") pod "6d50c64d-0681-400c-8daf-d061888e2576" (UID: "6d50c64d-0681-400c-8daf-d061888e2576"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.132449 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.132486 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d50c64d-0681-400c-8daf-d061888e2576-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.132499 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwzjp\" (UniqueName: \"kubernetes.io/projected/6d50c64d-0681-400c-8daf-d061888e2576-kube-api-access-nwzjp\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.754721 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.782684 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6e8372d-7ebc-4f22-8d3d-6d653c128b06","Type":"ContainerStarted","Data":"1be98d9b34c1e9a8c03d153810f7eb61636ba64426214ab705cae525935c5242"} Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.782728 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6e8372d-7ebc-4f22-8d3d-6d653c128b06","Type":"ContainerStarted","Data":"d2c990d70a0a202e89fdfdf55fe6e779315f8e8016b2d9abf431d4e19a6f7ac4"} Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.789668 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6d50c64d-0681-400c-8daf-d061888e2576","Type":"ContainerDied","Data":"d24222a1e8644534e4299a6b5e4783e5a0ff39ce16b019435b797f4832cd7722"} Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.789747 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.789759 4817 scope.go:117] "RemoveContainer" containerID="0bc6c5603aab187ef95120f9c4d06035f9c7fdc4758d45260e310ac6aa1f6c55" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.806192 4817 generic.go:334] "Generic (PLEG): container finished" podID="55106d39-246d-49ea-ab48-d7b703b72eef" containerID="e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6" exitCode=0 Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.806241 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55106d39-246d-49ea-ab48-d7b703b72eef","Type":"ContainerDied","Data":"e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6"} Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.806267 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55106d39-246d-49ea-ab48-d7b703b72eef","Type":"ContainerDied","Data":"ce87577891535cb7c2c0d65950546740fe6cccedf060d44d1da0cb861f7f1beb"} Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.806318 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.836543 4817 scope.go:117] "RemoveContainer" containerID="e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.846720 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55106d39-246d-49ea-ab48-d7b703b72eef-logs\") pod \"55106d39-246d-49ea-ab48-d7b703b72eef\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.847202 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55106d39-246d-49ea-ab48-d7b703b72eef-logs" (OuterVolumeSpecName: "logs") pod "55106d39-246d-49ea-ab48-d7b703b72eef" (UID: "55106d39-246d-49ea-ab48-d7b703b72eef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.847534 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8trv\" (UniqueName: \"kubernetes.io/projected/55106d39-246d-49ea-ab48-d7b703b72eef-kube-api-access-h8trv\") pod \"55106d39-246d-49ea-ab48-d7b703b72eef\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.847678 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-config-data\") pod \"55106d39-246d-49ea-ab48-d7b703b72eef\" (UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.847709 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-combined-ca-bundle\") pod \"55106d39-246d-49ea-ab48-d7b703b72eef\" 
(UID: \"55106d39-246d-49ea-ab48-d7b703b72eef\") " Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.849106 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55106d39-246d-49ea-ab48-d7b703b72eef-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.849487 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.849471688 podStartE2EDuration="2.849471688s" podCreationTimestamp="2026-02-18 14:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:27.812423129 +0000 UTC m=+1350.387959122" watchObservedRunningTime="2026-02-18 14:21:27.849471688 +0000 UTC m=+1350.425007671" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.856836 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55106d39-246d-49ea-ab48-d7b703b72eef-kube-api-access-h8trv" (OuterVolumeSpecName: "kube-api-access-h8trv") pod "55106d39-246d-49ea-ab48-d7b703b72eef" (UID: "55106d39-246d-49ea-ab48-d7b703b72eef"). InnerVolumeSpecName "kube-api-access-h8trv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.863884 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.872692 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.886147 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55106d39-246d-49ea-ab48-d7b703b72eef" (UID: "55106d39-246d-49ea-ab48-d7b703b72eef"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.912841 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:21:27 crc kubenswrapper[4817]: E0218 14:21:27.913534 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" containerName="nova-api-log" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.913549 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" containerName="nova-api-log" Feb 18 14:21:27 crc kubenswrapper[4817]: E0218 14:21:27.913576 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" containerName="nova-api-api" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.913583 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" containerName="nova-api-api" Feb 18 14:21:27 crc kubenswrapper[4817]: E0218 14:21:27.913602 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d50c64d-0681-400c-8daf-d061888e2576" containerName="nova-scheduler-scheduler" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.913608 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d50c64d-0681-400c-8daf-d061888e2576" containerName="nova-scheduler-scheduler" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.915516 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-config-data" (OuterVolumeSpecName: "config-data") pod "55106d39-246d-49ea-ab48-d7b703b72eef" (UID: "55106d39-246d-49ea-ab48-d7b703b72eef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.915777 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" containerName="nova-api-api" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.915805 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" containerName="nova-api-log" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.915825 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d50c64d-0681-400c-8daf-d061888e2576" containerName="nova-scheduler-scheduler" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.919003 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.922244 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.929791 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.951958 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkqfg\" (UniqueName: \"kubernetes.io/projected/cde987c5-28be-48dd-835a-30ad08140eb8-kube-api-access-fkqfg\") pod \"nova-scheduler-0\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.952113 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 
14:21:27.952138 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-config-data\") pod \"nova-scheduler-0\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.952272 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.952298 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55106d39-246d-49ea-ab48-d7b703b72eef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.952310 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8trv\" (UniqueName: \"kubernetes.io/projected/55106d39-246d-49ea-ab48-d7b703b72eef-kube-api-access-h8trv\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:27 crc kubenswrapper[4817]: I0218 14:21:27.962458 4817 scope.go:117] "RemoveContainer" containerID="6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac" Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.020618 4817 scope.go:117] "RemoveContainer" containerID="e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6" Feb 18 14:21:28 crc kubenswrapper[4817]: E0218 14:21:28.021071 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6\": container with ID starting with e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6 not found: ID does not exist" containerID="e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6" Feb 18 14:21:28 crc 
kubenswrapper[4817]: I0218 14:21:28.021119 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6"} err="failed to get container status \"e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6\": rpc error: code = NotFound desc = could not find container \"e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6\": container with ID starting with e03bd11a3537a78ff9be585a7445496834cf74ad39d8d91756a8dee8b32da0e6 not found: ID does not exist" Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.021152 4817 scope.go:117] "RemoveContainer" containerID="6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac" Feb 18 14:21:28 crc kubenswrapper[4817]: E0218 14:21:28.021572 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac\": container with ID starting with 6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac not found: ID does not exist" containerID="6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac" Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.021599 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac"} err="failed to get container status \"6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac\": rpc error: code = NotFound desc = could not find container \"6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac\": container with ID starting with 6b90228165c1ae793adfa4c2ca91523d60491c9007a99446739444fa3db7a5ac not found: ID does not exist" Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.054619 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkqfg\" (UniqueName: 
\"kubernetes.io/projected/cde987c5-28be-48dd-835a-30ad08140eb8-kube-api-access-fkqfg\") pod \"nova-scheduler-0\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.054755 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.054783 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-config-data\") pod \"nova-scheduler-0\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.062692 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.066521 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-config-data\") pod \"nova-scheduler-0\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " pod="openstack/nova-scheduler-0" Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.092716 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkqfg\" (UniqueName: \"kubernetes.io/projected/cde987c5-28be-48dd-835a-30ad08140eb8-kube-api-access-fkqfg\") pod \"nova-scheduler-0\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " 
pod="openstack/nova-scheduler-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.189187 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d50c64d-0681-400c-8daf-d061888e2576" path="/var/lib/kubelet/pods/6d50c64d-0681-400c-8daf-d061888e2576/volumes"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.193040 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.222617 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.256457 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.284039 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.285827 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.292933 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.304383 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.365854 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-config-data\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.366920 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c665ad27-3a60-4b0f-854e-00505781b81a-logs\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.367039 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.367184 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d66dq\" (UniqueName: \"kubernetes.io/projected/c665ad27-3a60-4b0f-854e-00505781b81a-kube-api-access-d66dq\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.471409 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.471568 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d66dq\" (UniqueName: \"kubernetes.io/projected/c665ad27-3a60-4b0f-854e-00505781b81a-kube-api-access-d66dq\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.471628 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-config-data\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.471763 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c665ad27-3a60-4b0f-854e-00505781b81a-logs\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.472659 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c665ad27-3a60-4b0f-854e-00505781b81a-logs\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.478799 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-config-data\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.478991 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.500566 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d66dq\" (UniqueName: \"kubernetes.io/projected/c665ad27-3a60-4b0f-854e-00505781b81a-kube-api-access-d66dq\") pod \"nova-api-0\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") " pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.633556 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 14:21:28 crc kubenswrapper[4817]: I0218 14:21:28.830840 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 18 14:21:28 crc kubenswrapper[4817]: W0218 14:21:28.836536 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcde987c5_28be_48dd_835a_30ad08140eb8.slice/crio-9bec1611082a62dbb822b1ab7c2604be30bece6f339adcd1e43480d6f9003877 WatchSource:0}: Error finding container 9bec1611082a62dbb822b1ab7c2604be30bece6f339adcd1e43480d6f9003877: Status 404 returned error can't find the container with id 9bec1611082a62dbb822b1ab7c2604be30bece6f339adcd1e43480d6f9003877
Feb 18 14:21:29 crc kubenswrapper[4817]: I0218 14:21:29.138064 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:29 crc kubenswrapper[4817]: W0218 14:21:29.151733 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc665ad27_3a60_4b0f_854e_00505781b81a.slice/crio-6ba5393bb604dccc32be9c796a627d4724dd2a0e4bd1d8cf0ace3625d95bd2f5 WatchSource:0}: Error finding container 6ba5393bb604dccc32be9c796a627d4724dd2a0e4bd1d8cf0ace3625d95bd2f5: Status 404 returned error can't find the container with id 6ba5393bb604dccc32be9c796a627d4724dd2a0e4bd1d8cf0ace3625d95bd2f5
Feb 18 14:21:29 crc kubenswrapper[4817]: I0218 14:21:29.827185 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cde987c5-28be-48dd-835a-30ad08140eb8","Type":"ContainerStarted","Data":"5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df"}
Feb 18 14:21:29 crc kubenswrapper[4817]: I0218 14:21:29.827240 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cde987c5-28be-48dd-835a-30ad08140eb8","Type":"ContainerStarted","Data":"9bec1611082a62dbb822b1ab7c2604be30bece6f339adcd1e43480d6f9003877"}
Feb 18 14:21:29 crc kubenswrapper[4817]: I0218 14:21:29.829626 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c665ad27-3a60-4b0f-854e-00505781b81a","Type":"ContainerStarted","Data":"e3375cdeb13f396c2ccc819d81b91758fb99a7f5f79e6337f60d63eccfc1ed22"}
Feb 18 14:21:29 crc kubenswrapper[4817]: I0218 14:21:29.829678 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c665ad27-3a60-4b0f-854e-00505781b81a","Type":"ContainerStarted","Data":"fc2210fca05f1eddb86c52b4046cb984c11fc3c5d1c5f24339464e3ec4604ef0"}
Feb 18 14:21:29 crc kubenswrapper[4817]: I0218 14:21:29.829695 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c665ad27-3a60-4b0f-854e-00505781b81a","Type":"ContainerStarted","Data":"6ba5393bb604dccc32be9c796a627d4724dd2a0e4bd1d8cf0ace3625d95bd2f5"}
Feb 18 14:21:29 crc kubenswrapper[4817]: I0218 14:21:29.856349 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.856310414 podStartE2EDuration="2.856310414s" podCreationTimestamp="2026-02-18 14:21:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:29.846925432 +0000 UTC m=+1352.422461425" watchObservedRunningTime="2026-02-18 14:21:29.856310414 +0000 UTC m=+1352.431846437"
Feb 18 14:21:29 crc kubenswrapper[4817]: I0218 14:21:29.882848 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.88282462 podStartE2EDuration="1.88282462s" podCreationTimestamp="2026-02-18 14:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:29.868808988 +0000 UTC m=+1352.444344991" watchObservedRunningTime="2026-02-18 14:21:29.88282462 +0000 UTC m=+1352.458360613"
Feb 18 14:21:30 crc kubenswrapper[4817]: I0218 14:21:30.195368 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55106d39-246d-49ea-ab48-d7b703b72eef" path="/var/lib/kubelet/pods/55106d39-246d-49ea-ab48-d7b703b72eef/volumes"
Feb 18 14:21:31 crc kubenswrapper[4817]: I0218 14:21:31.127420 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 18 14:21:31 crc kubenswrapper[4817]: I0218 14:21:31.127793 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 18 14:21:31 crc kubenswrapper[4817]: I0218 14:21:31.132143 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5scpm" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" probeResult="failure" output=<
Feb 18 14:21:31 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Feb 18 14:21:31 crc kubenswrapper[4817]: >
Feb 18 14:21:31 crc kubenswrapper[4817]: I0218 14:21:31.847799 4817 generic.go:334] "Generic (PLEG): container finished" podID="e176b326-3d3d-4b95-8a7e-e18448de49ae" containerID="47a50248f30af6ef7ab21313783ae7fc54be9ceb9b4f064a1b9e653f3b841298" exitCode=0
Feb 18 14:21:31 crc kubenswrapper[4817]: I0218 14:21:31.847848 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hptxc" event={"ID":"e176b326-3d3d-4b95-8a7e-e18448de49ae","Type":"ContainerDied","Data":"47a50248f30af6ef7ab21313783ae7fc54be9ceb9b4f064a1b9e653f3b841298"}
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.257533 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.313144 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.486578 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvmgg\" (UniqueName: \"kubernetes.io/projected/e176b326-3d3d-4b95-8a7e-e18448de49ae-kube-api-access-tvmgg\") pod \"e176b326-3d3d-4b95-8a7e-e18448de49ae\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") "
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.486743 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-combined-ca-bundle\") pod \"e176b326-3d3d-4b95-8a7e-e18448de49ae\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") "
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.486881 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-config-data\") pod \"e176b326-3d3d-4b95-8a7e-e18448de49ae\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") "
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.486932 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-scripts\") pod \"e176b326-3d3d-4b95-8a7e-e18448de49ae\" (UID: \"e176b326-3d3d-4b95-8a7e-e18448de49ae\") "
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.492210 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e176b326-3d3d-4b95-8a7e-e18448de49ae-kube-api-access-tvmgg" (OuterVolumeSpecName: "kube-api-access-tvmgg") pod "e176b326-3d3d-4b95-8a7e-e18448de49ae" (UID: "e176b326-3d3d-4b95-8a7e-e18448de49ae"). InnerVolumeSpecName "kube-api-access-tvmgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.492333 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-scripts" (OuterVolumeSpecName: "scripts") pod "e176b326-3d3d-4b95-8a7e-e18448de49ae" (UID: "e176b326-3d3d-4b95-8a7e-e18448de49ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.516399 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-config-data" (OuterVolumeSpecName: "config-data") pod "e176b326-3d3d-4b95-8a7e-e18448de49ae" (UID: "e176b326-3d3d-4b95-8a7e-e18448de49ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.517790 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e176b326-3d3d-4b95-8a7e-e18448de49ae" (UID: "e176b326-3d3d-4b95-8a7e-e18448de49ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.591773 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.592038 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.592051 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvmgg\" (UniqueName: \"kubernetes.io/projected/e176b326-3d3d-4b95-8a7e-e18448de49ae-kube-api-access-tvmgg\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.592062 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e176b326-3d3d-4b95-8a7e-e18448de49ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.877867 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hptxc" event={"ID":"e176b326-3d3d-4b95-8a7e-e18448de49ae","Type":"ContainerDied","Data":"344195bdba5eb0382780eb0c52a435e8ff3de1ff1649f7cac555d49251e20624"}
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.877919 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="344195bdba5eb0382780eb0c52a435e8ff3de1ff1649f7cac555d49251e20624"
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.879301 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hptxc"
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.926538 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.996025 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 18 14:21:33 crc kubenswrapper[4817]: E0218 14:21:33.996515 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e176b326-3d3d-4b95-8a7e-e18448de49ae" containerName="nova-cell1-conductor-db-sync"
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.996535 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e176b326-3d3d-4b95-8a7e-e18448de49ae" containerName="nova-cell1-conductor-db-sync"
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.996725 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="e176b326-3d3d-4b95-8a7e-e18448de49ae" containerName="nova-cell1-conductor-db-sync"
Feb 18 14:21:33 crc kubenswrapper[4817]: I0218 14:21:33.997591 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.008582 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.008905 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.024469 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bab4ac8-afc6-4ac1-938c-2d04b5dc7822-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.024566 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bab4ac8-afc6-4ac1-938c-2d04b5dc7822-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.024656 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72nqn\" (UniqueName: \"kubernetes.io/projected/7bab4ac8-afc6-4ac1-938c-2d04b5dc7822-kube-api-access-72nqn\") pod \"nova-cell1-conductor-0\" (UID: \"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.127319 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bab4ac8-afc6-4ac1-938c-2d04b5dc7822-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.127388 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bab4ac8-afc6-4ac1-938c-2d04b5dc7822-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.127425 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72nqn\" (UniqueName: \"kubernetes.io/projected/7bab4ac8-afc6-4ac1-938c-2d04b5dc7822-kube-api-access-72nqn\") pod \"nova-cell1-conductor-0\" (UID: \"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.134297 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bab4ac8-afc6-4ac1-938c-2d04b5dc7822-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.149257 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bab4ac8-afc6-4ac1-938c-2d04b5dc7822-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.150241 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72nqn\" (UniqueName: \"kubernetes.io/projected/7bab4ac8-afc6-4ac1-938c-2d04b5dc7822-kube-api-access-72nqn\") pod \"nova-cell1-conductor-0\" (UID: \"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822\") " pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.326616 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:34 crc kubenswrapper[4817]: W0218 14:21:34.830543 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bab4ac8_afc6_4ac1_938c_2d04b5dc7822.slice/crio-df57b17a5869af410daab1863be254790891279275c777d664b28a1634134636 WatchSource:0}: Error finding container df57b17a5869af410daab1863be254790891279275c777d664b28a1634134636: Status 404 returned error can't find the container with id df57b17a5869af410daab1863be254790891279275c777d664b28a1634134636
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.830888 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 18 14:21:34 crc kubenswrapper[4817]: I0218 14:21:34.887179 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822","Type":"ContainerStarted","Data":"df57b17a5869af410daab1863be254790891279275c777d664b28a1634134636"}
Feb 18 14:21:35 crc kubenswrapper[4817]: I0218 14:21:35.900065 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7bab4ac8-afc6-4ac1-938c-2d04b5dc7822","Type":"ContainerStarted","Data":"bf8a9ae55a760173d88b620d17768e2a2657f00e5778be8718a20d1771807d20"}
Feb 18 14:21:35 crc kubenswrapper[4817]: I0218 14:21:35.901846 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 18 14:21:35 crc kubenswrapper[4817]: I0218 14:21:35.918073 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.918055591 podStartE2EDuration="2.918055591s" podCreationTimestamp="2026-02-18 14:21:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:35.914656883 +0000 UTC m=+1358.490192876" watchObservedRunningTime="2026-02-18 14:21:35.918055591 +0000 UTC m=+1358.493591574"
Feb 18 14:21:36 crc kubenswrapper[4817]: I0218 14:21:36.128021 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 18 14:21:36 crc kubenswrapper[4817]: I0218 14:21:36.128071 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 18 14:21:37 crc kubenswrapper[4817]: I0218 14:21:37.139212 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 18 14:21:37 crc kubenswrapper[4817]: I0218 14:21:37.139298 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 18 14:21:38 crc kubenswrapper[4817]: I0218 14:21:38.257919 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 18 14:21:38 crc kubenswrapper[4817]: I0218 14:21:38.299014 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 18 14:21:38 crc kubenswrapper[4817]: I0218 14:21:38.483002 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 14:21:38 crc kubenswrapper[4817]: I0218 14:21:38.483210 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b50fdcb5-1983-4d14-ab5f-390b3dc090ce" containerName="kube-state-metrics" containerID="cri-o://024c6cade3c38911d1846544533703753c41b2dbdc33336d6261e602485f6d9b" gracePeriod=30
Feb 18 14:21:38 crc kubenswrapper[4817]: I0218 14:21:38.634341 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 18 14:21:38 crc kubenswrapper[4817]: I0218 14:21:38.634674 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:38.947390 4817 generic.go:334] "Generic (PLEG): container finished" podID="b50fdcb5-1983-4d14-ab5f-390b3dc090ce" containerID="024c6cade3c38911d1846544533703753c41b2dbdc33336d6261e602485f6d9b" exitCode=2
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:38.947476 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b50fdcb5-1983-4d14-ab5f-390b3dc090ce","Type":"ContainerDied","Data":"024c6cade3c38911d1846544533703753c41b2dbdc33336d6261e602485f6d9b"}
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:38.992230 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:39.091995 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:39.189152 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4982j\" (UniqueName: \"kubernetes.io/projected/b50fdcb5-1983-4d14-ab5f-390b3dc090ce-kube-api-access-4982j\") pod \"b50fdcb5-1983-4d14-ab5f-390b3dc090ce\" (UID: \"b50fdcb5-1983-4d14-ab5f-390b3dc090ce\") "
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:39.203196 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50fdcb5-1983-4d14-ab5f-390b3dc090ce-kube-api-access-4982j" (OuterVolumeSpecName: "kube-api-access-4982j") pod "b50fdcb5-1983-4d14-ab5f-390b3dc090ce" (UID: "b50fdcb5-1983-4d14-ab5f-390b3dc090ce"). InnerVolumeSpecName "kube-api-access-4982j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:39.292959 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4982j\" (UniqueName: \"kubernetes.io/projected/b50fdcb5-1983-4d14-ab5f-390b3dc090ce-kube-api-access-4982j\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:39.676393 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.224:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:39.718356 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.224:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:39.973320 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:39.973463 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b50fdcb5-1983-4d14-ab5f-390b3dc090ce","Type":"ContainerDied","Data":"3437761b58e3decdd124d9e73ffbe01fc7ed463745a654012279e6cffedd4525"}
Feb 18 14:21:39 crc kubenswrapper[4817]: I0218 14:21:39.973532 4817 scope.go:117] "RemoveContainer" containerID="024c6cade3c38911d1846544533703753c41b2dbdc33336d6261e602485f6d9b"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.040477 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.060455 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.081044 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 14:21:40 crc kubenswrapper[4817]: E0218 14:21:40.081813 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50fdcb5-1983-4d14-ab5f-390b3dc090ce" containerName="kube-state-metrics"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.081915 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50fdcb5-1983-4d14-ab5f-390b3dc090ce" containerName="kube-state-metrics"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.082302 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50fdcb5-1983-4d14-ab5f-390b3dc090ce" containerName="kube-state-metrics"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.083370 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.094583 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.095502 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.095698 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.172319 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5scpm"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.188151 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50fdcb5-1983-4d14-ab5f-390b3dc090ce" path="/var/lib/kubelet/pods/b50fdcb5-1983-4d14-ab5f-390b3dc090ce/volumes"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.215598 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/493be418-0841-4197-9fd7-50f22ecc6a5a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.215868 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gm54\" (UniqueName: \"kubernetes.io/projected/493be418-0841-4197-9fd7-50f22ecc6a5a-kube-api-access-5gm54\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.216026 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493be418-0841-4197-9fd7-50f22ecc6a5a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.216131 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/493be418-0841-4197-9fd7-50f22ecc6a5a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.259274 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5scpm"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.317674 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/493be418-0841-4197-9fd7-50f22ecc6a5a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.317736 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gm54\" (UniqueName: \"kubernetes.io/projected/493be418-0841-4197-9fd7-50f22ecc6a5a-kube-api-access-5gm54\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.317802 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493be418-0841-4197-9fd7-50f22ecc6a5a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.317821 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/493be418-0841-4197-9fd7-50f22ecc6a5a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.323837 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/493be418-0841-4197-9fd7-50f22ecc6a5a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.325676 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/493be418-0841-4197-9fd7-50f22ecc6a5a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.325935 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493be418-0841-4197-9fd7-50f22ecc6a5a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.338849 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gm54\" (UniqueName: \"kubernetes.io/projected/493be418-0841-4197-9fd7-50f22ecc6a5a-kube-api-access-5gm54\") pod \"kube-state-metrics-0\" (UID: \"493be418-0841-4197-9fd7-50f22ecc6a5a\") " pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.427669 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.892750 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.893946 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="ceilometer-central-agent" containerID="cri-o://64bc7c486971d176034a65a33fd115616be7cfb1c7d68b9a6b896d9f077ac6b2" gracePeriod=30
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.894101 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="proxy-httpd" containerID="cri-o://1c6159cba34b7d2a5ffaad168d6f5bac0e2cd56bda4f68e75ae287def34f86ad" gracePeriod=30
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.894146 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="sg-core" containerID="cri-o://55e6a2f3b20af16f8244c617e69436a1c43036f167d97e14c5e8caed1384ce67" gracePeriod=30
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.894178 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="ceilometer-notification-agent" containerID="cri-o://9131995f4e638ba36056894268ca2779e276a03848ff6db8fc441c6ccae4f3f2" gracePeriod=30
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.939780 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.969569 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"493be418-0841-4197-9fd7-50f22ecc6a5a","Type":"ContainerStarted","Data":"f174314d2ce6ac4d7533737e945b31304fcbfd84c95a06692b679bf619d90e08"}
Feb 18 14:21:40 crc kubenswrapper[4817]: I0218 14:21:40.988978 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5scpm"]
Feb 18 14:21:41 crc kubenswrapper[4817]: I0218 14:21:41.983615 4817 generic.go:334] "Generic (PLEG): container finished" podID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerID="1c6159cba34b7d2a5ffaad168d6f5bac0e2cd56bda4f68e75ae287def34f86ad" exitCode=0
Feb 18 14:21:41 crc kubenswrapper[4817]: I0218 14:21:41.983952 4817 generic.go:334] "Generic (PLEG): container finished" podID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerID="55e6a2f3b20af16f8244c617e69436a1c43036f167d97e14c5e8caed1384ce67" exitCode=2
Feb 18 14:21:41 crc kubenswrapper[4817]: I0218 14:21:41.983964 4817 generic.go:334] "Generic (PLEG): container finished" podID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerID="64bc7c486971d176034a65a33fd115616be7cfb1c7d68b9a6b896d9f077ac6b2" exitCode=0
Feb 18 14:21:41 crc kubenswrapper[4817]: I0218 14:21:41.983664 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed71e1c9-fa52-4a17-901c-5efc187043fb","Type":"ContainerDied","Data":"1c6159cba34b7d2a5ffaad168d6f5bac0e2cd56bda4f68e75ae287def34f86ad"}
Feb 18 14:21:41 crc kubenswrapper[4817]: I0218 14:21:41.984048 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed71e1c9-fa52-4a17-901c-5efc187043fb","Type":"ContainerDied","Data":"55e6a2f3b20af16f8244c617e69436a1c43036f167d97e14c5e8caed1384ce67"}
Feb 18 14:21:41 crc kubenswrapper[4817]: I0218 14:21:41.984071 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0"
event={"ID":"ed71e1c9-fa52-4a17-901c-5efc187043fb","Type":"ContainerDied","Data":"64bc7c486971d176034a65a33fd115616be7cfb1c7d68b9a6b896d9f077ac6b2"} Feb 18 14:21:41 crc kubenswrapper[4817]: I0218 14:21:41.986629 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"493be418-0841-4197-9fd7-50f22ecc6a5a","Type":"ContainerStarted","Data":"2d964db5f2a6d768a11793dddeabeabbb567d93ed701290c4cef38218065de12"} Feb 18 14:21:41 crc kubenswrapper[4817]: I0218 14:21:41.986766 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5scpm" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" containerID="cri-o://d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da" gracePeriod=2 Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.014955 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.605267478 podStartE2EDuration="2.014925558s" podCreationTimestamp="2026-02-18 14:21:40 +0000 UTC" firstStartedPulling="2026-02-18 14:21:40.93288658 +0000 UTC m=+1363.508422563" lastFinishedPulling="2026-02-18 14:21:41.34254466 +0000 UTC m=+1363.918080643" observedRunningTime="2026-02-18 14:21:42.002156447 +0000 UTC m=+1364.577692430" watchObservedRunningTime="2026-02-18 14:21:42.014925558 +0000 UTC m=+1364.590461541" Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.547249 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.672922 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-catalog-content\") pod \"01b17e66-ae59-413b-985f-ea5cf5e11600\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.673112 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-utilities\") pod \"01b17e66-ae59-413b-985f-ea5cf5e11600\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.673193 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z9hf\" (UniqueName: \"kubernetes.io/projected/01b17e66-ae59-413b-985f-ea5cf5e11600-kube-api-access-8z9hf\") pod \"01b17e66-ae59-413b-985f-ea5cf5e11600\" (UID: \"01b17e66-ae59-413b-985f-ea5cf5e11600\") " Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.673902 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-utilities" (OuterVolumeSpecName: "utilities") pod "01b17e66-ae59-413b-985f-ea5cf5e11600" (UID: "01b17e66-ae59-413b-985f-ea5cf5e11600"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.679898 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b17e66-ae59-413b-985f-ea5cf5e11600-kube-api-access-8z9hf" (OuterVolumeSpecName: "kube-api-access-8z9hf") pod "01b17e66-ae59-413b-985f-ea5cf5e11600" (UID: "01b17e66-ae59-413b-985f-ea5cf5e11600"). InnerVolumeSpecName "kube-api-access-8z9hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.778209 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.778261 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z9hf\" (UniqueName: \"kubernetes.io/projected/01b17e66-ae59-413b-985f-ea5cf5e11600-kube-api-access-8z9hf\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.816513 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01b17e66-ae59-413b-985f-ea5cf5e11600" (UID: "01b17e66-ae59-413b-985f-ea5cf5e11600"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.879987 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b17e66-ae59-413b-985f-ea5cf5e11600-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.998429 4817 generic.go:334] "Generic (PLEG): container finished" podID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerID="d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da" exitCode=0 Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.998492 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scpm" event={"ID":"01b17e66-ae59-413b-985f-ea5cf5e11600","Type":"ContainerDied","Data":"d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da"} Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.998501 4817 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5scpm" Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.998540 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scpm" event={"ID":"01b17e66-ae59-413b-985f-ea5cf5e11600","Type":"ContainerDied","Data":"0f3f1f2e1d901caa396c938299bb07cf9d23e240f22830b2c8b880fe0d38e204"} Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.998559 4817 scope.go:117] "RemoveContainer" containerID="d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da" Feb 18 14:21:42 crc kubenswrapper[4817]: I0218 14:21:42.998945 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 18 14:21:43 crc kubenswrapper[4817]: I0218 14:21:43.019704 4817 scope.go:117] "RemoveContainer" containerID="d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d" Feb 18 14:21:43 crc kubenswrapper[4817]: I0218 14:21:43.037572 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5scpm"] Feb 18 14:21:43 crc kubenswrapper[4817]: I0218 14:21:43.046730 4817 scope.go:117] "RemoveContainer" containerID="8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1" Feb 18 14:21:43 crc kubenswrapper[4817]: I0218 14:21:43.051307 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5scpm"] Feb 18 14:21:43 crc kubenswrapper[4817]: I0218 14:21:43.093131 4817 scope.go:117] "RemoveContainer" containerID="d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da" Feb 18 14:21:43 crc kubenswrapper[4817]: E0218 14:21:43.093794 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da\": container with ID starting with d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da not found: ID 
does not exist" containerID="d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da" Feb 18 14:21:43 crc kubenswrapper[4817]: I0218 14:21:43.093836 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da"} err="failed to get container status \"d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da\": rpc error: code = NotFound desc = could not find container \"d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da\": container with ID starting with d5281a5ac84c58a1c66190ef7efefe08147a31a81c491d34755cb4e7412470da not found: ID does not exist" Feb 18 14:21:43 crc kubenswrapper[4817]: I0218 14:21:43.093864 4817 scope.go:117] "RemoveContainer" containerID="d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d" Feb 18 14:21:43 crc kubenswrapper[4817]: E0218 14:21:43.094237 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d\": container with ID starting with d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d not found: ID does not exist" containerID="d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d" Feb 18 14:21:43 crc kubenswrapper[4817]: I0218 14:21:43.094260 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d"} err="failed to get container status \"d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d\": rpc error: code = NotFound desc = could not find container \"d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d\": container with ID starting with d37b5d2d6950d880e7a78a73090a7f2e8fbb202e14e81d8701c4623c0b54582d not found: ID does not exist" Feb 18 14:21:43 crc kubenswrapper[4817]: I0218 14:21:43.094277 4817 
scope.go:117] "RemoveContainer" containerID="8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1" Feb 18 14:21:43 crc kubenswrapper[4817]: E0218 14:21:43.094598 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1\": container with ID starting with 8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1 not found: ID does not exist" containerID="8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1" Feb 18 14:21:43 crc kubenswrapper[4817]: I0218 14:21:43.094630 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1"} err="failed to get container status \"8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1\": rpc error: code = NotFound desc = could not find container \"8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1\": container with ID starting with 8d2522b0b5aed8d79a9631a4539d24467dfc33b95b51c6b97a2fc8dfeac6cdf1 not found: ID does not exist" Feb 18 14:21:44 crc kubenswrapper[4817]: I0218 14:21:44.183778 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" path="/var/lib/kubelet/pods/01b17e66-ae59-413b-985f-ea5cf5e11600/volumes" Feb 18 14:21:44 crc kubenswrapper[4817]: I0218 14:21:44.360769 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.025197 4817 generic.go:334] "Generic (PLEG): container finished" podID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerID="9131995f4e638ba36056894268ca2779e276a03848ff6db8fc441c6ccae4f3f2" exitCode=0 Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.025247 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ed71e1c9-fa52-4a17-901c-5efc187043fb","Type":"ContainerDied","Data":"9131995f4e638ba36056894268ca2779e276a03848ff6db8fc441c6ccae4f3f2"} Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.460781 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.531643 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-config-data\") pod \"ed71e1c9-fa52-4a17-901c-5efc187043fb\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.531750 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-log-httpd\") pod \"ed71e1c9-fa52-4a17-901c-5efc187043fb\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.531786 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-scripts\") pod \"ed71e1c9-fa52-4a17-901c-5efc187043fb\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.531804 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-sg-core-conf-yaml\") pod \"ed71e1c9-fa52-4a17-901c-5efc187043fb\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.531925 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-combined-ca-bundle\") pod 
\"ed71e1c9-fa52-4a17-901c-5efc187043fb\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.531981 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-run-httpd\") pod \"ed71e1c9-fa52-4a17-901c-5efc187043fb\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.532028 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgmld\" (UniqueName: \"kubernetes.io/projected/ed71e1c9-fa52-4a17-901c-5efc187043fb-kube-api-access-mgmld\") pod \"ed71e1c9-fa52-4a17-901c-5efc187043fb\" (UID: \"ed71e1c9-fa52-4a17-901c-5efc187043fb\") " Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.534289 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed71e1c9-fa52-4a17-901c-5efc187043fb" (UID: "ed71e1c9-fa52-4a17-901c-5efc187043fb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.534576 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed71e1c9-fa52-4a17-901c-5efc187043fb" (UID: "ed71e1c9-fa52-4a17-901c-5efc187043fb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.546187 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed71e1c9-fa52-4a17-901c-5efc187043fb-kube-api-access-mgmld" (OuterVolumeSpecName: "kube-api-access-mgmld") pod "ed71e1c9-fa52-4a17-901c-5efc187043fb" (UID: "ed71e1c9-fa52-4a17-901c-5efc187043fb"). InnerVolumeSpecName "kube-api-access-mgmld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.549329 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-scripts" (OuterVolumeSpecName: "scripts") pod "ed71e1c9-fa52-4a17-901c-5efc187043fb" (UID: "ed71e1c9-fa52-4a17-901c-5efc187043fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.566966 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed71e1c9-fa52-4a17-901c-5efc187043fb" (UID: "ed71e1c9-fa52-4a17-901c-5efc187043fb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.618633 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed71e1c9-fa52-4a17-901c-5efc187043fb" (UID: "ed71e1c9-fa52-4a17-901c-5efc187043fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.635120 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.635165 4817 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.635181 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgmld\" (UniqueName: \"kubernetes.io/projected/ed71e1c9-fa52-4a17-901c-5efc187043fb-kube-api-access-mgmld\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.635192 4817 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed71e1c9-fa52-4a17-901c-5efc187043fb-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.635204 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.635217 4817 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.663216 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-config-data" (OuterVolumeSpecName: "config-data") pod "ed71e1c9-fa52-4a17-901c-5efc187043fb" (UID: "ed71e1c9-fa52-4a17-901c-5efc187043fb"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:45 crc kubenswrapper[4817]: I0218 14:21:45.737334 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed71e1c9-fa52-4a17-901c-5efc187043fb-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.042264 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed71e1c9-fa52-4a17-901c-5efc187043fb","Type":"ContainerDied","Data":"3371ff3ccddbfc64e59a5801f72c55462a8654d0f2e2f43a2878767a9012e200"} Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.042356 4817 scope.go:117] "RemoveContainer" containerID="1c6159cba34b7d2a5ffaad168d6f5bac0e2cd56bda4f68e75ae287def34f86ad" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.042880 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.105184 4817 scope.go:117] "RemoveContainer" containerID="55e6a2f3b20af16f8244c617e69436a1c43036f167d97e14c5e8caed1384ce67" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.114784 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.127194 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.137534 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:21:46 crc kubenswrapper[4817]: E0218 14:21:46.137938 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="extract-utilities" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.137955 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" 
containerName="extract-utilities" Feb 18 14:21:46 crc kubenswrapper[4817]: E0218 14:21:46.137967 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="ceilometer-central-agent" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.137976 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="ceilometer-central-agent" Feb 18 14:21:46 crc kubenswrapper[4817]: E0218 14:21:46.138787 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.138805 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" Feb 18 14:21:46 crc kubenswrapper[4817]: E0218 14:21:46.138815 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="proxy-httpd" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.138823 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="proxy-httpd" Feb 18 14:21:46 crc kubenswrapper[4817]: E0218 14:21:46.138836 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="sg-core" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.138845 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="sg-core" Feb 18 14:21:46 crc kubenswrapper[4817]: E0218 14:21:46.138857 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="extract-content" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.138863 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" 
containerName="extract-content" Feb 18 14:21:46 crc kubenswrapper[4817]: E0218 14:21:46.138878 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="ceilometer-notification-agent" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.138884 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="ceilometer-notification-agent" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.139143 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="ceilometer-central-agent" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.139172 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="ceilometer-notification-agent" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.139179 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="proxy-httpd" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.139190 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b17e66-ae59-413b-985f-ea5cf5e11600" containerName="registry-server" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.139200 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" containerName="sg-core" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.139373 4817 scope.go:117] "RemoveContainer" containerID="9131995f4e638ba36056894268ca2779e276a03848ff6db8fc441c6ccae4f3f2" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.140938 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.141661 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.146502 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.146686 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.146792 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.157268 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.158552 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.176915 4817 scope.go:117] "RemoveContainer" containerID="64bc7c486971d176034a65a33fd115616be7cfb1c7d68b9a6b896d9f077ac6b2" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.212004 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed71e1c9-fa52-4a17-901c-5efc187043fb" path="/var/lib/kubelet/pods/ed71e1c9-fa52-4a17-901c-5efc187043fb/volumes" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.215065 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.259391 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.259466 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-scripts\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.259728 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffjbw\" (UniqueName: \"kubernetes.io/projected/2550ff13-9cef-4d0a-a413-21d40e809b87-kube-api-access-ffjbw\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.259820 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.259853 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-config-data\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.259910 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-log-httpd\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.260198 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.260287 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-run-httpd\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.362473 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-run-httpd\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.362928 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.362983 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-scripts\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.363099 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-run-httpd\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.363138 
4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffjbw\" (UniqueName: \"kubernetes.io/projected/2550ff13-9cef-4d0a-a413-21d40e809b87-kube-api-access-ffjbw\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.363201 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.363242 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-config-data\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.363283 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-log-httpd\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.363493 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.363770 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-log-httpd\") pod \"ceilometer-0\" (UID: 
\"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.366614 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.367151 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-scripts\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.367192 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.367623 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-config-data\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.368915 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.381630 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffjbw\" (UniqueName: 
\"kubernetes.io/projected/2550ff13-9cef-4d0a-a413-21d40e809b87-kube-api-access-ffjbw\") pod \"ceilometer-0\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " pod="openstack/ceilometer-0" Feb 18 14:21:46 crc kubenswrapper[4817]: I0218 14:21:46.471939 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.004438 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.060192 4817 generic.go:334] "Generic (PLEG): container finished" podID="543ef68b-ffe2-4b3f-91b2-b34b458751f7" containerID="9c2bf5bd31790eaed02449111a9156317ae8e76208ab4773d58e99c8fb04ab3d" exitCode=137 Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.060645 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"543ef68b-ffe2-4b3f-91b2-b34b458751f7","Type":"ContainerDied","Data":"9c2bf5bd31790eaed02449111a9156317ae8e76208ab4773d58e99c8fb04ab3d"} Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.066647 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2550ff13-9cef-4d0a-a413-21d40e809b87","Type":"ContainerStarted","Data":"d4bcfbf8e0fc0731897328e96563da9a30ebe086e5601256759e23a945f9d3f5"} Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.075199 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.184349 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.289623 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zm57\" (UniqueName: \"kubernetes.io/projected/543ef68b-ffe2-4b3f-91b2-b34b458751f7-kube-api-access-7zm57\") pod \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.289834 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-config-data\") pod \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.290030 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-combined-ca-bundle\") pod \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\" (UID: \"543ef68b-ffe2-4b3f-91b2-b34b458751f7\") " Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.296304 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543ef68b-ffe2-4b3f-91b2-b34b458751f7-kube-api-access-7zm57" (OuterVolumeSpecName: "kube-api-access-7zm57") pod "543ef68b-ffe2-4b3f-91b2-b34b458751f7" (UID: "543ef68b-ffe2-4b3f-91b2-b34b458751f7"). InnerVolumeSpecName "kube-api-access-7zm57". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.350087 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "543ef68b-ffe2-4b3f-91b2-b34b458751f7" (UID: "543ef68b-ffe2-4b3f-91b2-b34b458751f7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.354856 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-config-data" (OuterVolumeSpecName: "config-data") pod "543ef68b-ffe2-4b3f-91b2-b34b458751f7" (UID: "543ef68b-ffe2-4b3f-91b2-b34b458751f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.393081 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.393112 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543ef68b-ffe2-4b3f-91b2-b34b458751f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:47 crc kubenswrapper[4817]: I0218 14:21:47.393121 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zm57\" (UniqueName: \"kubernetes.io/projected/543ef68b-ffe2-4b3f-91b2-b34b458751f7-kube-api-access-7zm57\") on node \"crc\" DevicePath \"\"" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.078671 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2550ff13-9cef-4d0a-a413-21d40e809b87","Type":"ContainerStarted","Data":"780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f"} Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.080510 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"543ef68b-ffe2-4b3f-91b2-b34b458751f7","Type":"ContainerDied","Data":"2846ee1a927f6a6f8b981c4b569b8bec9932f4f7ef4e42562dd1d10eab89fda3"} Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.080553 4817 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.080798 4817 scope.go:117] "RemoveContainer" containerID="9c2bf5bd31790eaed02449111a9156317ae8e76208ab4773d58e99c8fb04ab3d" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.149335 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.161799 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.203766 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543ef68b-ffe2-4b3f-91b2-b34b458751f7" path="/var/lib/kubelet/pods/543ef68b-ffe2-4b3f-91b2-b34b458751f7/volumes" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.204487 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 14:21:48 crc kubenswrapper[4817]: E0218 14:21:48.204825 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543ef68b-ffe2-4b3f-91b2-b34b458751f7" containerName="nova-cell1-novncproxy-novncproxy" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.204841 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="543ef68b-ffe2-4b3f-91b2-b34b458751f7" containerName="nova-cell1-novncproxy-novncproxy" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.205044 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="543ef68b-ffe2-4b3f-91b2-b34b458751f7" containerName="nova-cell1-novncproxy-novncproxy" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.205930 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.217925 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.218402 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.218478 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.242292 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.314274 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.314382 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.314418 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 
14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.314543 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.314578 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24rzp\" (UniqueName: \"kubernetes.io/projected/fc896752-3d52-40cd-8d7f-2b10ba1afab5-kube-api-access-24rzp\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.416914 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.417218 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.417454 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.417574 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24rzp\" (UniqueName: \"kubernetes.io/projected/fc896752-3d52-40cd-8d7f-2b10ba1afab5-kube-api-access-24rzp\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.417777 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.424198 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.430355 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.432552 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.433202 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc896752-3d52-40cd-8d7f-2b10ba1afab5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.439528 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24rzp\" (UniqueName: \"kubernetes.io/projected/fc896752-3d52-40cd-8d7f-2b10ba1afab5-kube-api-access-24rzp\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc896752-3d52-40cd-8d7f-2b10ba1afab5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.537457 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.643840 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.644247 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.645007 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.645052 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.654836 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.654880 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.849796 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64c8b5dcc-9xs2l"] Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 
14:21:48.852580 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.883289 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64c8b5dcc-9xs2l"] Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.983198 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bm24\" (UniqueName: \"kubernetes.io/projected/b08decc2-ae57-4858-93bd-acc42ae42148-kube-api-access-9bm24\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.983529 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-sb\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.983687 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-config\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.983818 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-swift-storage-0\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.984067 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-svc\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:48 crc kubenswrapper[4817]: I0218 14:21:48.984246 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-nb\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.088645 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.089763 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bm24\" (UniqueName: \"kubernetes.io/projected/b08decc2-ae57-4858-93bd-acc42ae42148-kube-api-access-9bm24\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.090170 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-sb\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.090209 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-config\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: 
\"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.090248 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-swift-storage-0\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.090304 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-svc\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.090367 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-nb\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.091374 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-swift-storage-0\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.091595 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-sb\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " 
pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.092102 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-nb\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.092136 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-svc\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.092188 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-config\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.131468 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bm24\" (UniqueName: \"kubernetes.io/projected/b08decc2-ae57-4858-93bd-acc42ae42148-kube-api-access-9bm24\") pod \"dnsmasq-dns-64c8b5dcc-9xs2l\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.140220 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2550ff13-9cef-4d0a-a413-21d40e809b87","Type":"ContainerStarted","Data":"c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64"} Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.226388 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l"
Feb 18 14:21:49 crc kubenswrapper[4817]: W0218 14:21:49.819928 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb08decc2_ae57_4858_93bd_acc42ae42148.slice/crio-e4df5b4345d8240ace1757e3c36822caec3ea6c5940df9dc7c066aaddfb481e8 WatchSource:0}: Error finding container e4df5b4345d8240ace1757e3c36822caec3ea6c5940df9dc7c066aaddfb481e8: Status 404 returned error can't find the container with id e4df5b4345d8240ace1757e3c36822caec3ea6c5940df9dc7c066aaddfb481e8
Feb 18 14:21:49 crc kubenswrapper[4817]: I0218 14:21:49.821408 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64c8b5dcc-9xs2l"]
Feb 18 14:21:50 crc kubenswrapper[4817]: I0218 14:21:50.196784 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fc896752-3d52-40cd-8d7f-2b10ba1afab5","Type":"ContainerStarted","Data":"33c9e9004639a29201ded6ee693d1abea5c1280a52c85d17772eb20b37522409"}
Feb 18 14:21:50 crc kubenswrapper[4817]: I0218 14:21:50.197177 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fc896752-3d52-40cd-8d7f-2b10ba1afab5","Type":"ContainerStarted","Data":"76eec97e47636a6e2aafc74c8719a8cf7b2616c0c03262177139e4312ea16b10"}
Feb 18 14:21:50 crc kubenswrapper[4817]: I0218 14:21:50.197192 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" event={"ID":"b08decc2-ae57-4858-93bd-acc42ae42148","Type":"ContainerStarted","Data":"e4df5b4345d8240ace1757e3c36822caec3ea6c5940df9dc7c066aaddfb481e8"}
Feb 18 14:21:50 crc kubenswrapper[4817]: I0218 14:21:50.207942 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.2079190300000002 podStartE2EDuration="2.20791903s" podCreationTimestamp="2026-02-18 14:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:50.201853593 +0000 UTC m=+1372.777389586" watchObservedRunningTime="2026-02-18 14:21:50.20791903 +0000 UTC m=+1372.783455013"
Feb 18 14:21:50 crc kubenswrapper[4817]: I0218 14:21:50.443445 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 18 14:21:51 crc kubenswrapper[4817]: I0218 14:21:51.193532 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2550ff13-9cef-4d0a-a413-21d40e809b87","Type":"ContainerStarted","Data":"121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf"}
Feb 18 14:21:51 crc kubenswrapper[4817]: I0218 14:21:51.195451 4817 generic.go:334] "Generic (PLEG): container finished" podID="b08decc2-ae57-4858-93bd-acc42ae42148" containerID="287c80c5d3d514702d73a712c09c3cfc643b05e69403d0ae2d41731b6006066f" exitCode=0
Feb 18 14:21:51 crc kubenswrapper[4817]: I0218 14:21:51.195568 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" event={"ID":"b08decc2-ae57-4858-93bd-acc42ae42148","Type":"ContainerDied","Data":"287c80c5d3d514702d73a712c09c3cfc643b05e69403d0ae2d41731b6006066f"}
Feb 18 14:21:51 crc kubenswrapper[4817]: I0218 14:21:51.726574 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:51 crc kubenswrapper[4817]: I0218 14:21:51.727184 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" containerName="nova-api-log" containerID="cri-o://fc2210fca05f1eddb86c52b4046cb984c11fc3c5d1c5f24339464e3ec4604ef0" gracePeriod=30
Feb 18 14:21:51 crc kubenswrapper[4817]: I0218 14:21:51.727260 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" containerName="nova-api-api" containerID="cri-o://e3375cdeb13f396c2ccc819d81b91758fb99a7f5f79e6337f60d63eccfc1ed22" gracePeriod=30
Feb 18 14:21:52 crc kubenswrapper[4817]: I0218 14:21:52.208524 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" event={"ID":"b08decc2-ae57-4858-93bd-acc42ae42148","Type":"ContainerStarted","Data":"56c1e054a5fe3b8e91d26c5b54e34e0521ee5f8873010f66b22a5af0e89f1122"}
Feb 18 14:21:52 crc kubenswrapper[4817]: I0218 14:21:52.210210 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l"
Feb 18 14:21:52 crc kubenswrapper[4817]: I0218 14:21:52.221394 4817 generic.go:334] "Generic (PLEG): container finished" podID="c665ad27-3a60-4b0f-854e-00505781b81a" containerID="fc2210fca05f1eddb86c52b4046cb984c11fc3c5d1c5f24339464e3ec4604ef0" exitCode=143
Feb 18 14:21:52 crc kubenswrapper[4817]: I0218 14:21:52.221455 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c665ad27-3a60-4b0f-854e-00505781b81a","Type":"ContainerDied","Data":"fc2210fca05f1eddb86c52b4046cb984c11fc3c5d1c5f24339464e3ec4604ef0"}
Feb 18 14:21:52 crc kubenswrapper[4817]: I0218 14:21:52.221776 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:21:52 crc kubenswrapper[4817]: I0218 14:21:52.246431 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" podStartSLOduration=4.246411167 podStartE2EDuration="4.246411167s" podCreationTimestamp="2026-02-18 14:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:52.238211054 +0000 UTC m=+1374.813747057" watchObservedRunningTime="2026-02-18 14:21:52.246411167 +0000 UTC m=+1374.821947140"
Feb 18 14:21:53 crc kubenswrapper[4817]: I0218 14:21:53.237112 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2550ff13-9cef-4d0a-a413-21d40e809b87","Type":"ContainerStarted","Data":"f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291"}
Feb 18 14:21:53 crc kubenswrapper[4817]: I0218 14:21:53.237714 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 18 14:21:53 crc kubenswrapper[4817]: I0218 14:21:53.237551 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="proxy-httpd" containerID="cri-o://f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291" gracePeriod=30
Feb 18 14:21:53 crc kubenswrapper[4817]: I0218 14:21:53.237280 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="ceilometer-central-agent" containerID="cri-o://780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f" gracePeriod=30
Feb 18 14:21:53 crc kubenswrapper[4817]: I0218 14:21:53.237580 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="ceilometer-notification-agent" containerID="cri-o://c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64" gracePeriod=30
Feb 18 14:21:53 crc kubenswrapper[4817]: I0218 14:21:53.237566 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="sg-core" containerID="cri-o://121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf" gracePeriod=30
Feb 18 14:21:53 crc kubenswrapper[4817]: I0218 14:21:53.265196 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.656748609 podStartE2EDuration="7.265172247s" podCreationTimestamp="2026-02-18 14:21:46 +0000 UTC" firstStartedPulling="2026-02-18 14:21:47.012723165 +0000 UTC m=+1369.588259148" lastFinishedPulling="2026-02-18 14:21:52.621146803 +0000 UTC m=+1375.196682786" observedRunningTime="2026-02-18 14:21:53.255053935 +0000 UTC m=+1375.830589918" watchObservedRunningTime="2026-02-18 14:21:53.265172247 +0000 UTC m=+1375.840708230"
Feb 18 14:21:53 crc kubenswrapper[4817]: I0218 14:21:53.538477 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 14:21:54 crc kubenswrapper[4817]: I0218 14:21:54.248393 4817 generic.go:334] "Generic (PLEG): container finished" podID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerID="121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf" exitCode=2
Feb 18 14:21:54 crc kubenswrapper[4817]: I0218 14:21:54.248717 4817 generic.go:334] "Generic (PLEG): container finished" podID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerID="c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64" exitCode=0
Feb 18 14:21:54 crc kubenswrapper[4817]: I0218 14:21:54.248434 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2550ff13-9cef-4d0a-a413-21d40e809b87","Type":"ContainerDied","Data":"121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf"}
Feb 18 14:21:54 crc kubenswrapper[4817]: I0218 14:21:54.248886 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2550ff13-9cef-4d0a-a413-21d40e809b87","Type":"ContainerDied","Data":"c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64"}
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.262214 4817 generic.go:334] "Generic (PLEG): container finished" podID="c665ad27-3a60-4b0f-854e-00505781b81a" containerID="e3375cdeb13f396c2ccc819d81b91758fb99a7f5f79e6337f60d63eccfc1ed22" exitCode=0
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.262521 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c665ad27-3a60-4b0f-854e-00505781b81a","Type":"ContainerDied","Data":"e3375cdeb13f396c2ccc819d81b91758fb99a7f5f79e6337f60d63eccfc1ed22"}
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.433785 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.499336 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c665ad27-3a60-4b0f-854e-00505781b81a-logs\") pod \"c665ad27-3a60-4b0f-854e-00505781b81a\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") "
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.499505 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-combined-ca-bundle\") pod \"c665ad27-3a60-4b0f-854e-00505781b81a\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") "
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.499549 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d66dq\" (UniqueName: \"kubernetes.io/projected/c665ad27-3a60-4b0f-854e-00505781b81a-kube-api-access-d66dq\") pod \"c665ad27-3a60-4b0f-854e-00505781b81a\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") "
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.499740 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-config-data\") pod \"c665ad27-3a60-4b0f-854e-00505781b81a\" (UID: \"c665ad27-3a60-4b0f-854e-00505781b81a\") "
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.500602 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c665ad27-3a60-4b0f-854e-00505781b81a-logs" (OuterVolumeSpecName: "logs") pod "c665ad27-3a60-4b0f-854e-00505781b81a" (UID: "c665ad27-3a60-4b0f-854e-00505781b81a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.510561 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c665ad27-3a60-4b0f-854e-00505781b81a-kube-api-access-d66dq" (OuterVolumeSpecName: "kube-api-access-d66dq") pod "c665ad27-3a60-4b0f-854e-00505781b81a" (UID: "c665ad27-3a60-4b0f-854e-00505781b81a"). InnerVolumeSpecName "kube-api-access-d66dq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.539329 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c665ad27-3a60-4b0f-854e-00505781b81a" (UID: "c665ad27-3a60-4b0f-854e-00505781b81a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.546357 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-config-data" (OuterVolumeSpecName: "config-data") pod "c665ad27-3a60-4b0f-854e-00505781b81a" (UID: "c665ad27-3a60-4b0f-854e-00505781b81a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.602069 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.602106 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c665ad27-3a60-4b0f-854e-00505781b81a-logs\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.602120 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c665ad27-3a60-4b0f-854e-00505781b81a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:55 crc kubenswrapper[4817]: I0218 14:21:55.602134 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d66dq\" (UniqueName: \"kubernetes.io/projected/c665ad27-3a60-4b0f-854e-00505781b81a-kube-api-access-d66dq\") on node \"crc\" DevicePath \"\""
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.274806 4817 generic.go:334] "Generic (PLEG): container finished" podID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerID="780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f" exitCode=0
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.274868 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2550ff13-9cef-4d0a-a413-21d40e809b87","Type":"ContainerDied","Data":"780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f"}
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.277435 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c665ad27-3a60-4b0f-854e-00505781b81a","Type":"ContainerDied","Data":"6ba5393bb604dccc32be9c796a627d4724dd2a0e4bd1d8cf0ace3625d95bd2f5"}
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.277477 4817 scope.go:117] "RemoveContainer" containerID="e3375cdeb13f396c2ccc819d81b91758fb99a7f5f79e6337f60d63eccfc1ed22"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.277644 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.303310 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.306383 4817 scope.go:117] "RemoveContainer" containerID="fc2210fca05f1eddb86c52b4046cb984c11fc3c5d1c5f24339464e3ec4604ef0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.319889 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.329229 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:56 crc kubenswrapper[4817]: E0218 14:21:56.329755 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" containerName="nova-api-log"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.329792 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" containerName="nova-api-log"
Feb 18 14:21:56 crc kubenswrapper[4817]: E0218 14:21:56.329801 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" containerName="nova-api-api"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.329807 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" containerName="nova-api-api"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.330044 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" containerName="nova-api-api"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.330098 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" containerName="nova-api-log"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.331416 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.335640 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.335740 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.340944 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.350863 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.520116 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.520657 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-public-tls-certs\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.520895 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32dbdd4a-921a-413d-9feb-102ff3337d7e-logs\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.521094 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.521324 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-config-data\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.521404 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65d9\" (UniqueName: \"kubernetes.io/projected/32dbdd4a-921a-413d-9feb-102ff3337d7e-kube-api-access-c65d9\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.623496 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-config-data\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.623576 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65d9\" (UniqueName: \"kubernetes.io/projected/32dbdd4a-921a-413d-9feb-102ff3337d7e-kube-api-access-c65d9\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.623636 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.623673 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-public-tls-certs\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.623752 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32dbdd4a-921a-413d-9feb-102ff3337d7e-logs\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.623819 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.624324 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32dbdd4a-921a-413d-9feb-102ff3337d7e-logs\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.629732 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-public-tls-certs\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.629824 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.630167 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.638038 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-config-data\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.640605 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65d9\" (UniqueName: \"kubernetes.io/projected/32dbdd4a-921a-413d-9feb-102ff3337d7e-kube-api-access-c65d9\") pod \"nova-api-0\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " pod="openstack/nova-api-0"
Feb 18 14:21:56 crc kubenswrapper[4817]: I0218 14:21:56.664378 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 18 14:21:57 crc kubenswrapper[4817]: I0218 14:21:57.151613 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 18 14:21:57 crc kubenswrapper[4817]: I0218 14:21:57.402251 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32dbdd4a-921a-413d-9feb-102ff3337d7e","Type":"ContainerStarted","Data":"0c1a945c27407719a16eaa882a02676d4e88540594e6dcc01b3b590ff8bd8fd5"}
Feb 18 14:21:58 crc kubenswrapper[4817]: I0218 14:21:58.184727 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c665ad27-3a60-4b0f-854e-00505781b81a" path="/var/lib/kubelet/pods/c665ad27-3a60-4b0f-854e-00505781b81a/volumes"
Feb 18 14:21:58 crc kubenswrapper[4817]: I0218 14:21:58.422118 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32dbdd4a-921a-413d-9feb-102ff3337d7e","Type":"ContainerStarted","Data":"828ae87010c78765c313f339639193e5efd0b302f9bba2db9e8de540dbbaa51f"}
Feb 18 14:21:58 crc kubenswrapper[4817]: I0218 14:21:58.422166 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32dbdd4a-921a-413d-9feb-102ff3337d7e","Type":"ContainerStarted","Data":"bf8956ae2fede457f6132a1ed06622def3580c3f84f2922a8f5aca098967b3e5"}
Feb 18 14:21:58 crc kubenswrapper[4817]: I0218 14:21:58.537621 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 14:21:58 crc kubenswrapper[4817]: I0218 14:21:58.556799 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 14:21:58 crc kubenswrapper[4817]: I0218 14:21:58.584122 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.584096303 podStartE2EDuration="2.584096303s" podCreationTimestamp="2026-02-18 14:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:21:58.447707214 +0000 UTC m=+1381.023243217" watchObservedRunningTime="2026-02-18 14:21:58.584096303 +0000 UTC m=+1381.159632286"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.228594 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.296000 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d578b86f9-th88v"]
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.296230 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d578b86f9-th88v" podUID="e6b021a8-f766-4296-87b8-88b45f99a5ba" containerName="dnsmasq-dns" containerID="cri-o://e0d235e105fd3072304bce38d7b920551c82803f91c13815c70e98c0c9045e74" gracePeriod=10
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.451760 4817 generic.go:334] "Generic (PLEG): container finished" podID="e6b021a8-f766-4296-87b8-88b45f99a5ba" containerID="e0d235e105fd3072304bce38d7b920551c82803f91c13815c70e98c0c9045e74" exitCode=0
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.451840 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d578b86f9-th88v" event={"ID":"e6b021a8-f766-4296-87b8-88b45f99a5ba","Type":"ContainerDied","Data":"e0d235e105fd3072304bce38d7b920551c82803f91c13815c70e98c0c9045e74"}
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.477605 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.688924 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-khvxr"]
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.690220 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.695524 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.695718 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.716995 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-khvxr"]
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.813435 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nrr7\" (UniqueName: \"kubernetes.io/projected/22047a76-beb3-439b-994a-25a8c306be7b-kube-api-access-5nrr7\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.813519 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-config-data\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.813555 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-scripts\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.814138 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.906120 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d578b86f9-th88v"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.915719 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.916022 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nrr7\" (UniqueName: \"kubernetes.io/projected/22047a76-beb3-439b-994a-25a8c306be7b-kube-api-access-5nrr7\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.916149 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-config-data\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.916288 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-scripts\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.922304 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-config-data\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.922680 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-scripts\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.924836 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:21:59 crc kubenswrapper[4817]: I0218 14:21:59.936954 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nrr7\" (UniqueName: \"kubernetes.io/projected/22047a76-beb3-439b-994a-25a8c306be7b-kube-api-access-5nrr7\") pod \"nova-cell1-cell-mapping-khvxr\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.018240 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-swift-storage-0\") pod \"e6b021a8-f766-4296-87b8-88b45f99a5ba\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") "
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.018297 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-sb\") pod \"e6b021a8-f766-4296-87b8-88b45f99a5ba\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") "
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.018357 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-nb\") pod \"e6b021a8-f766-4296-87b8-88b45f99a5ba\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") "
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.018415 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkf8b\" (UniqueName: \"kubernetes.io/projected/e6b021a8-f766-4296-87b8-88b45f99a5ba-kube-api-access-xkf8b\") pod \"e6b021a8-f766-4296-87b8-88b45f99a5ba\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") "
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.018567 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-config\") pod \"e6b021a8-f766-4296-87b8-88b45f99a5ba\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") "
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.018651 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-svc\") pod \"e6b021a8-f766-4296-87b8-88b45f99a5ba\" (UID: \"e6b021a8-f766-4296-87b8-88b45f99a5ba\") "
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.021453 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b021a8-f766-4296-87b8-88b45f99a5ba-kube-api-access-xkf8b" (OuterVolumeSpecName: "kube-api-access-xkf8b") pod "e6b021a8-f766-4296-87b8-88b45f99a5ba" (UID: "e6b021a8-f766-4296-87b8-88b45f99a5ba"). InnerVolumeSpecName "kube-api-access-xkf8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.022287 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-khvxr"
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.079078 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e6b021a8-f766-4296-87b8-88b45f99a5ba" (UID: "e6b021a8-f766-4296-87b8-88b45f99a5ba"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.084238 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e6b021a8-f766-4296-87b8-88b45f99a5ba" (UID: "e6b021a8-f766-4296-87b8-88b45f99a5ba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.088144 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-config" (OuterVolumeSpecName: "config") pod "e6b021a8-f766-4296-87b8-88b45f99a5ba" (UID: "e6b021a8-f766-4296-87b8-88b45f99a5ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.089190 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6b021a8-f766-4296-87b8-88b45f99a5ba" (UID: "e6b021a8-f766-4296-87b8-88b45f99a5ba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.117124 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e6b021a8-f766-4296-87b8-88b45f99a5ba" (UID: "e6b021a8-f766-4296-87b8-88b45f99a5ba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.122890 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.122948 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.122963 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.122996 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkf8b\" (UniqueName: \"kubernetes.io/projected/e6b021a8-f766-4296-87b8-88b45f99a5ba-kube-api-access-xkf8b\") on node \"crc\" DevicePath \"\""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.123013 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-config\") on node \"crc\" DevicePath \"\""
Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.123025 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName:
\"kubernetes.io/configmap/e6b021a8-f766-4296-87b8-88b45f99a5ba-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.466744 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d578b86f9-th88v" event={"ID":"e6b021a8-f766-4296-87b8-88b45f99a5ba","Type":"ContainerDied","Data":"2c5d33fd94566ebdc2a2cfb229a950124e4d811986e19a29d0d3d4df7cd422f0"} Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.466836 4817 scope.go:117] "RemoveContainer" containerID="e0d235e105fd3072304bce38d7b920551c82803f91c13815c70e98c0c9045e74" Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.466854 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d578b86f9-th88v" Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.519886 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d578b86f9-th88v"] Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.528071 4817 scope.go:117] "RemoveContainer" containerID="9f4e4287c85bd791f9c7fcd06f427262a323952dc37bc8614cbd4785daa1688a" Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.529059 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d578b86f9-th88v"] Feb 18 14:22:00 crc kubenswrapper[4817]: I0218 14:22:00.540797 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-khvxr"] Feb 18 14:22:01 crc kubenswrapper[4817]: I0218 14:22:01.488494 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-khvxr" event={"ID":"22047a76-beb3-439b-994a-25a8c306be7b","Type":"ContainerStarted","Data":"522a5c4cd7a61ef4b17190b5ad9cbb8f6613603f9ad5fd28f81bab978cd24c1c"} Feb 18 14:22:01 crc kubenswrapper[4817]: I0218 14:22:01.488930 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-khvxr" 
event={"ID":"22047a76-beb3-439b-994a-25a8c306be7b","Type":"ContainerStarted","Data":"0c06ace94c08e7a1ff07b093e8b321e4a93c44510ae9596533436dc2e108af93"} Feb 18 14:22:01 crc kubenswrapper[4817]: I0218 14:22:01.512502 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-khvxr" podStartSLOduration=2.512477665 podStartE2EDuration="2.512477665s" podCreationTimestamp="2026-02-18 14:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:22:01.500229878 +0000 UTC m=+1384.075765861" watchObservedRunningTime="2026-02-18 14:22:01.512477665 +0000 UTC m=+1384.088013648" Feb 18 14:22:02 crc kubenswrapper[4817]: I0218 14:22:02.188027 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b021a8-f766-4296-87b8-88b45f99a5ba" path="/var/lib/kubelet/pods/e6b021a8-f766-4296-87b8-88b45f99a5ba/volumes" Feb 18 14:22:06 crc kubenswrapper[4817]: I0218 14:22:06.545574 4817 generic.go:334] "Generic (PLEG): container finished" podID="22047a76-beb3-439b-994a-25a8c306be7b" containerID="522a5c4cd7a61ef4b17190b5ad9cbb8f6613603f9ad5fd28f81bab978cd24c1c" exitCode=0 Feb 18 14:22:06 crc kubenswrapper[4817]: I0218 14:22:06.545666 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-khvxr" event={"ID":"22047a76-beb3-439b-994a-25a8c306be7b","Type":"ContainerDied","Data":"522a5c4cd7a61ef4b17190b5ad9cbb8f6613603f9ad5fd28f81bab978cd24c1c"} Feb 18 14:22:06 crc kubenswrapper[4817]: I0218 14:22:06.664883 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 14:22:06 crc kubenswrapper[4817]: I0218 14:22:06.664999 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 18 14:22:07 crc kubenswrapper[4817]: I0218 14:22:07.678199 4817 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 18 14:22:07 crc kubenswrapper[4817]: I0218 14:22:07.678230 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.043661 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-khvxr" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.100371 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-config-data\") pod \"22047a76-beb3-439b-994a-25a8c306be7b\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.100524 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nrr7\" (UniqueName: \"kubernetes.io/projected/22047a76-beb3-439b-994a-25a8c306be7b-kube-api-access-5nrr7\") pod \"22047a76-beb3-439b-994a-25a8c306be7b\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.100681 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-combined-ca-bundle\") pod \"22047a76-beb3-439b-994a-25a8c306be7b\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.100777 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-scripts\") pod \"22047a76-beb3-439b-994a-25a8c306be7b\" (UID: \"22047a76-beb3-439b-994a-25a8c306be7b\") " Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.117389 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22047a76-beb3-439b-994a-25a8c306be7b-kube-api-access-5nrr7" (OuterVolumeSpecName: "kube-api-access-5nrr7") pod "22047a76-beb3-439b-994a-25a8c306be7b" (UID: "22047a76-beb3-439b-994a-25a8c306be7b"). InnerVolumeSpecName "kube-api-access-5nrr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.121348 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-scripts" (OuterVolumeSpecName: "scripts") pod "22047a76-beb3-439b-994a-25a8c306be7b" (UID: "22047a76-beb3-439b-994a-25a8c306be7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.156268 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22047a76-beb3-439b-994a-25a8c306be7b" (UID: "22047a76-beb3-439b-994a-25a8c306be7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.158252 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-config-data" (OuterVolumeSpecName: "config-data") pod "22047a76-beb3-439b-994a-25a8c306be7b" (UID: "22047a76-beb3-439b-994a-25a8c306be7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.204375 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.204414 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.204427 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nrr7\" (UniqueName: \"kubernetes.io/projected/22047a76-beb3-439b-994a-25a8c306be7b-kube-api-access-5nrr7\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.204441 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22047a76-beb3-439b-994a-25a8c306be7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.571238 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-khvxr" event={"ID":"22047a76-beb3-439b-994a-25a8c306be7b","Type":"ContainerDied","Data":"0c06ace94c08e7a1ff07b093e8b321e4a93c44510ae9596533436dc2e108af93"} Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.571279 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c06ace94c08e7a1ff07b093e8b321e4a93c44510ae9596533436dc2e108af93" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.571338 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-khvxr" Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.892836 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.893303 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cde987c5-28be-48dd-835a-30ad08140eb8" containerName="nova-scheduler-scheduler" containerID="cri-o://5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df" gracePeriod=30 Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.917695 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.918489 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerName="nova-api-log" containerID="cri-o://bf8956ae2fede457f6132a1ed06622def3580c3f84f2922a8f5aca098967b3e5" gracePeriod=30 Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.918925 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerName="nova-api-api" containerID="cri-o://828ae87010c78765c313f339639193e5efd0b302f9bba2db9e8de540dbbaa51f" gracePeriod=30 Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.948399 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.948626 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-log" containerID="cri-o://d2c990d70a0a202e89fdfdf55fe6e779315f8e8016b2d9abf431d4e19a6f7ac4" gracePeriod=30 Feb 18 14:22:08 crc kubenswrapper[4817]: I0218 14:22:08.949076 4817 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-metadata" containerID="cri-o://1be98d9b34c1e9a8c03d153810f7eb61636ba64426214ab705cae525935c5242" gracePeriod=30 Feb 18 14:22:09 crc kubenswrapper[4817]: I0218 14:22:09.582284 4817 generic.go:334] "Generic (PLEG): container finished" podID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerID="d2c990d70a0a202e89fdfdf55fe6e779315f8e8016b2d9abf431d4e19a6f7ac4" exitCode=143 Feb 18 14:22:09 crc kubenswrapper[4817]: I0218 14:22:09.582353 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6e8372d-7ebc-4f22-8d3d-6d653c128b06","Type":"ContainerDied","Data":"d2c990d70a0a202e89fdfdf55fe6e779315f8e8016b2d9abf431d4e19a6f7ac4"} Feb 18 14:22:09 crc kubenswrapper[4817]: I0218 14:22:09.584887 4817 generic.go:334] "Generic (PLEG): container finished" podID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerID="bf8956ae2fede457f6132a1ed06622def3580c3f84f2922a8f5aca098967b3e5" exitCode=143 Feb 18 14:22:09 crc kubenswrapper[4817]: I0218 14:22:09.584913 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32dbdd4a-921a-413d-9feb-102ff3337d7e","Type":"ContainerDied","Data":"bf8956ae2fede457f6132a1ed06622def3580c3f84f2922a8f5aca098967b3e5"} Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.081676 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": read tcp 10.217.0.2:47642->10.217.0.222:8775: read: connection reset by peer" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.081673 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": read tcp 10.217.0.2:47644->10.217.0.222:8775: read: connection reset by peer" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.564497 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.636857 4817 generic.go:334] "Generic (PLEG): container finished" podID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerID="1be98d9b34c1e9a8c03d153810f7eb61636ba64426214ab705cae525935c5242" exitCode=0 Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.636933 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6e8372d-7ebc-4f22-8d3d-6d653c128b06","Type":"ContainerDied","Data":"1be98d9b34c1e9a8c03d153810f7eb61636ba64426214ab705cae525935c5242"} Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.636966 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d6e8372d-7ebc-4f22-8d3d-6d653c128b06","Type":"ContainerDied","Data":"01e3ff83d904d0da3040a1616e0b9d58c233b5dfa71d6bc6a348585e5b250c13"} Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.637021 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01e3ff83d904d0da3040a1616e0b9d58c233b5dfa71d6bc6a348585e5b250c13" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.641558 4817 generic.go:334] "Generic (PLEG): container finished" podID="cde987c5-28be-48dd-835a-30ad08140eb8" containerID="5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df" exitCode=0 Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.641611 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cde987c5-28be-48dd-835a-30ad08140eb8","Type":"ContainerDied","Data":"5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df"} Feb 18 14:22:12 
crc kubenswrapper[4817]: I0218 14:22:12.641618 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.641643 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cde987c5-28be-48dd-835a-30ad08140eb8","Type":"ContainerDied","Data":"9bec1611082a62dbb822b1ab7c2604be30bece6f339adcd1e43480d6f9003877"} Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.641663 4817 scope.go:117] "RemoveContainer" containerID="5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.663271 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.671166 4817 scope.go:117] "RemoveContainer" containerID="5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df" Feb 18 14:22:12 crc kubenswrapper[4817]: E0218 14:22:12.671756 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df\": container with ID starting with 5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df not found: ID does not exist" containerID="5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.671791 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df"} err="failed to get container status \"5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df\": rpc error: code = NotFound desc = could not find container \"5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df\": container with ID starting with 
5182819c5508a7d1b7a1e4a119ca1536d229c98b56b0a4eca4b5a710103a01df not found: ID does not exist" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.703965 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkqfg\" (UniqueName: \"kubernetes.io/projected/cde987c5-28be-48dd-835a-30ad08140eb8-kube-api-access-fkqfg\") pod \"cde987c5-28be-48dd-835a-30ad08140eb8\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.704097 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-config-data\") pod \"cde987c5-28be-48dd-835a-30ad08140eb8\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.704139 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-combined-ca-bundle\") pod \"cde987c5-28be-48dd-835a-30ad08140eb8\" (UID: \"cde987c5-28be-48dd-835a-30ad08140eb8\") " Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.711197 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde987c5-28be-48dd-835a-30ad08140eb8-kube-api-access-fkqfg" (OuterVolumeSpecName: "kube-api-access-fkqfg") pod "cde987c5-28be-48dd-835a-30ad08140eb8" (UID: "cde987c5-28be-48dd-835a-30ad08140eb8"). InnerVolumeSpecName "kube-api-access-fkqfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.741768 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cde987c5-28be-48dd-835a-30ad08140eb8" (UID: "cde987c5-28be-48dd-835a-30ad08140eb8"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.745248 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-config-data" (OuterVolumeSpecName: "config-data") pod "cde987c5-28be-48dd-835a-30ad08140eb8" (UID: "cde987c5-28be-48dd-835a-30ad08140eb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.805774 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zt9z\" (UniqueName: \"kubernetes.io/projected/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-kube-api-access-6zt9z\") pod \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.805938 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-combined-ca-bundle\") pod \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.806118 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-nova-metadata-tls-certs\") pod \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.806169 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-logs\") pod \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 
14:22:12.806215 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-config-data\") pod \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\" (UID: \"d6e8372d-7ebc-4f22-8d3d-6d653c128b06\") " Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.807892 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-logs" (OuterVolumeSpecName: "logs") pod "d6e8372d-7ebc-4f22-8d3d-6d653c128b06" (UID: "d6e8372d-7ebc-4f22-8d3d-6d653c128b06"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.808086 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkqfg\" (UniqueName: \"kubernetes.io/projected/cde987c5-28be-48dd-835a-30ad08140eb8-kube-api-access-fkqfg\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.808128 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.808142 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde987c5-28be-48dd-835a-30ad08140eb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.809160 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-kube-api-access-6zt9z" (OuterVolumeSpecName: "kube-api-access-6zt9z") pod "d6e8372d-7ebc-4f22-8d3d-6d653c128b06" (UID: "d6e8372d-7ebc-4f22-8d3d-6d653c128b06"). InnerVolumeSpecName "kube-api-access-6zt9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.836556 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-config-data" (OuterVolumeSpecName: "config-data") pod "d6e8372d-7ebc-4f22-8d3d-6d653c128b06" (UID: "d6e8372d-7ebc-4f22-8d3d-6d653c128b06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.840545 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6e8372d-7ebc-4f22-8d3d-6d653c128b06" (UID: "d6e8372d-7ebc-4f22-8d3d-6d653c128b06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.867325 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d6e8372d-7ebc-4f22-8d3d-6d653c128b06" (UID: "d6e8372d-7ebc-4f22-8d3d-6d653c128b06"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.910097 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zt9z\" (UniqueName: \"kubernetes.io/projected/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-kube-api-access-6zt9z\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.910158 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.910172 4817 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.910185 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.910198 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e8372d-7ebc-4f22-8d3d-6d653c128b06-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.986115 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:22:12 crc kubenswrapper[4817]: I0218 14:22:12.997869 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.008095 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:22:13 crc kubenswrapper[4817]: E0218 14:22:13.009269 4817 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6b021a8-f766-4296-87b8-88b45f99a5ba" containerName="dnsmasq-dns" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.009290 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b021a8-f766-4296-87b8-88b45f99a5ba" containerName="dnsmasq-dns" Feb 18 14:22:13 crc kubenswrapper[4817]: E0218 14:22:13.009371 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde987c5-28be-48dd-835a-30ad08140eb8" containerName="nova-scheduler-scheduler" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.009385 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde987c5-28be-48dd-835a-30ad08140eb8" containerName="nova-scheduler-scheduler" Feb 18 14:22:13 crc kubenswrapper[4817]: E0218 14:22:13.009397 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-metadata" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.009404 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-metadata" Feb 18 14:22:13 crc kubenswrapper[4817]: E0218 14:22:13.009472 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22047a76-beb3-439b-994a-25a8c306be7b" containerName="nova-manage" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.009485 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="22047a76-beb3-439b-994a-25a8c306be7b" containerName="nova-manage" Feb 18 14:22:13 crc kubenswrapper[4817]: E0218 14:22:13.009495 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b021a8-f766-4296-87b8-88b45f99a5ba" containerName="init" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.009503 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b021a8-f766-4296-87b8-88b45f99a5ba" containerName="init" Feb 18 14:22:13 crc kubenswrapper[4817]: E0218 14:22:13.009575 4817 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-log" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.009587 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-log" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.010324 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-metadata" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.010357 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" containerName="nova-metadata-log" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.010420 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde987c5-28be-48dd-835a-30ad08140eb8" containerName="nova-scheduler-scheduler" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.010443 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="22047a76-beb3-439b-994a-25a8c306be7b" containerName="nova-manage" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.010452 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b021a8-f766-4296-87b8-88b45f99a5ba" containerName="dnsmasq-dns" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.014153 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.016407 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76z2f\" (UniqueName: \"kubernetes.io/projected/d5bac496-cea3-4c61-91c1-0c0ebc884737-kube-api-access-76z2f\") pod \"nova-scheduler-0\" (UID: \"d5bac496-cea3-4c61-91c1-0c0ebc884737\") " pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.016440 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bac496-cea3-4c61-91c1-0c0ebc884737-config-data\") pod \"nova-scheduler-0\" (UID: \"d5bac496-cea3-4c61-91c1-0c0ebc884737\") " pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.016532 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bac496-cea3-4c61-91c1-0c0ebc884737-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5bac496-cea3-4c61-91c1-0c0ebc884737\") " pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.019449 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.020271 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.118073 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76z2f\" (UniqueName: \"kubernetes.io/projected/d5bac496-cea3-4c61-91c1-0c0ebc884737-kube-api-access-76z2f\") pod \"nova-scheduler-0\" (UID: \"d5bac496-cea3-4c61-91c1-0c0ebc884737\") " pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.118117 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bac496-cea3-4c61-91c1-0c0ebc884737-config-data\") pod \"nova-scheduler-0\" (UID: \"d5bac496-cea3-4c61-91c1-0c0ebc884737\") " pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.118212 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bac496-cea3-4c61-91c1-0c0ebc884737-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5bac496-cea3-4c61-91c1-0c0ebc884737\") " pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.125771 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bac496-cea3-4c61-91c1-0c0ebc884737-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5bac496-cea3-4c61-91c1-0c0ebc884737\") " pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.127706 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bac496-cea3-4c61-91c1-0c0ebc884737-config-data\") pod \"nova-scheduler-0\" (UID: \"d5bac496-cea3-4c61-91c1-0c0ebc884737\") " pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.145221 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76z2f\" (UniqueName: \"kubernetes.io/projected/d5bac496-cea3-4c61-91c1-0c0ebc884737-kube-api-access-76z2f\") pod \"nova-scheduler-0\" (UID: \"d5bac496-cea3-4c61-91c1-0c0ebc884737\") " pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.340869 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.657060 4817 generic.go:334] "Generic (PLEG): container finished" podID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerID="828ae87010c78765c313f339639193e5efd0b302f9bba2db9e8de540dbbaa51f" exitCode=0 Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.657218 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32dbdd4a-921a-413d-9feb-102ff3337d7e","Type":"ContainerDied","Data":"828ae87010c78765c313f339639193e5efd0b302f9bba2db9e8de540dbbaa51f"} Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.659563 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.756495 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.798041 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.811856 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.813958 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.819286 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.819365 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.821535 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.844846 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 18 14:22:13 crc kubenswrapper[4817]: W0218 14:22:13.849289 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5bac496_cea3_4c61_91c1_0c0ebc884737.slice/crio-8085a42611663913a4b5898edb89107dde8928b7186ae06b63242dd8253c2dc5 WatchSource:0}: Error finding container 8085a42611663913a4b5898edb89107dde8928b7186ae06b63242dd8253c2dc5: Status 404 returned error can't find the container with id 8085a42611663913a4b5898edb89107dde8928b7186ae06b63242dd8253c2dc5 Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.953854 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.954439 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-logs\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 
18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.954520 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsr96\" (UniqueName: \"kubernetes.io/projected/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-kube-api-access-vsr96\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.954570 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:13 crc kubenswrapper[4817]: I0218 14:22:13.954674 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-config-data\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.056358 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-config-data\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.056406 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.056517 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-logs\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.056556 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsr96\" (UniqueName: \"kubernetes.io/projected/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-kube-api-access-vsr96\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.056599 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.058812 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-logs\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.063832 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.065732 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.071638 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-config-data\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.073598 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsr96\" (UniqueName: \"kubernetes.io/projected/bd52eab4-329f-4cab-83cc-c082d2d3f1d4-kube-api-access-vsr96\") pod \"nova-metadata-0\" (UID: \"bd52eab4-329f-4cab-83cc-c082d2d3f1d4\") " pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.134195 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.184926 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cde987c5-28be-48dd-835a-30ad08140eb8" path="/var/lib/kubelet/pods/cde987c5-28be-48dd-835a-30ad08140eb8/volumes" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.185785 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e8372d-7ebc-4f22-8d3d-6d653c128b06" path="/var/lib/kubelet/pods/d6e8372d-7ebc-4f22-8d3d-6d653c128b06/volumes" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.266370 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.365069 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-config-data\") pod \"32dbdd4a-921a-413d-9feb-102ff3337d7e\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.365162 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-public-tls-certs\") pod \"32dbdd4a-921a-413d-9feb-102ff3337d7e\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.365255 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65d9\" (UniqueName: \"kubernetes.io/projected/32dbdd4a-921a-413d-9feb-102ff3337d7e-kube-api-access-c65d9\") pod \"32dbdd4a-921a-413d-9feb-102ff3337d7e\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.365307 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-internal-tls-certs\") pod \"32dbdd4a-921a-413d-9feb-102ff3337d7e\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.365432 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-combined-ca-bundle\") pod \"32dbdd4a-921a-413d-9feb-102ff3337d7e\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.365633 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/32dbdd4a-921a-413d-9feb-102ff3337d7e-logs\") pod \"32dbdd4a-921a-413d-9feb-102ff3337d7e\" (UID: \"32dbdd4a-921a-413d-9feb-102ff3337d7e\") " Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.366788 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32dbdd4a-921a-413d-9feb-102ff3337d7e-logs" (OuterVolumeSpecName: "logs") pod "32dbdd4a-921a-413d-9feb-102ff3337d7e" (UID: "32dbdd4a-921a-413d-9feb-102ff3337d7e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.391092 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32dbdd4a-921a-413d-9feb-102ff3337d7e-kube-api-access-c65d9" (OuterVolumeSpecName: "kube-api-access-c65d9") pod "32dbdd4a-921a-413d-9feb-102ff3337d7e" (UID: "32dbdd4a-921a-413d-9feb-102ff3337d7e"). InnerVolumeSpecName "kube-api-access-c65d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.469311 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32dbdd4a-921a-413d-9feb-102ff3337d7e-logs\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.469684 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c65d9\" (UniqueName: \"kubernetes.io/projected/32dbdd4a-921a-413d-9feb-102ff3337d7e-kube-api-access-c65d9\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.475210 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-config-data" (OuterVolumeSpecName: "config-data") pod "32dbdd4a-921a-413d-9feb-102ff3337d7e" (UID: "32dbdd4a-921a-413d-9feb-102ff3337d7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.556287 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "32dbdd4a-921a-413d-9feb-102ff3337d7e" (UID: "32dbdd4a-921a-413d-9feb-102ff3337d7e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.562130 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32dbdd4a-921a-413d-9feb-102ff3337d7e" (UID: "32dbdd4a-921a-413d-9feb-102ff3337d7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.577596 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "32dbdd4a-921a-413d-9feb-102ff3337d7e" (UID: "32dbdd4a-921a-413d-9feb-102ff3337d7e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.578612 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.578637 4817 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.578647 4817 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.578656 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dbdd4a-921a-413d-9feb-102ff3337d7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.634242 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.673297 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"32dbdd4a-921a-413d-9feb-102ff3337d7e","Type":"ContainerDied","Data":"0c1a945c27407719a16eaa882a02676d4e88540594e6dcc01b3b590ff8bd8fd5"} Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.673345 4817 scope.go:117] "RemoveContainer" containerID="828ae87010c78765c313f339639193e5efd0b302f9bba2db9e8de540dbbaa51f" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.673457 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.676713 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd52eab4-329f-4cab-83cc-c082d2d3f1d4","Type":"ContainerStarted","Data":"e629d1509f13df30c406ed1eef6172ba60496f05b679005593eac2d8df3b30c6"} Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.679376 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5bac496-cea3-4c61-91c1-0c0ebc884737","Type":"ContainerStarted","Data":"f9ef7638179dd108cdfa36ff7e2ef3aa4ad8ae5f4b984c3543a14a0b5a1acad0"} Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.679423 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5bac496-cea3-4c61-91c1-0c0ebc884737","Type":"ContainerStarted","Data":"8085a42611663913a4b5898edb89107dde8928b7186ae06b63242dd8253c2dc5"} Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.700419 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7003989219999998 podStartE2EDuration="2.700398922s" podCreationTimestamp="2026-02-18 14:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:22:14.699534389 +0000 UTC m=+1397.275070372" watchObservedRunningTime="2026-02-18 14:22:14.700398922 +0000 UTC m=+1397.275934905" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.717367 4817 scope.go:117] "RemoveContainer" containerID="bf8956ae2fede457f6132a1ed06622def3580c3f84f2922a8f5aca098967b3e5" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.733134 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.754818 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 
18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.774999 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 18 14:22:14 crc kubenswrapper[4817]: E0218 14:22:14.775493 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerName="nova-api-log" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.775508 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerName="nova-api-log" Feb 18 14:22:14 crc kubenswrapper[4817]: E0218 14:22:14.775545 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerName="nova-api-api" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.775552 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerName="nova-api-api" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.775781 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerName="nova-api-log" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.775803 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" containerName="nova-api-api" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.790238 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.796325 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.799671 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.799889 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.800061 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.888505 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-config-data\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.888565 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsbr6\" (UniqueName: \"kubernetes.io/projected/bea1dd6e-5f07-4dd6-a191-f07f59d36043-kube-api-access-bsbr6\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.888683 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bea1dd6e-5f07-4dd6-a191-f07f59d36043-logs\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.888740 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-public-tls-certs\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " 
pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.888792 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.888938 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.990917 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.991041 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-config-data\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.991079 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsbr6\" (UniqueName: \"kubernetes.io/projected/bea1dd6e-5f07-4dd6-a191-f07f59d36043-kube-api-access-bsbr6\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.991173 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/bea1dd6e-5f07-4dd6-a191-f07f59d36043-logs\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.991201 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-public-tls-certs\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.991240 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.991646 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bea1dd6e-5f07-4dd6-a191-f07f59d36043-logs\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.996217 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-public-tls-certs\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.996433 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.996745 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:14 crc kubenswrapper[4817]: I0218 14:22:14.998277 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bea1dd6e-5f07-4dd6-a191-f07f59d36043-config-data\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:15 crc kubenswrapper[4817]: I0218 14:22:15.012434 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsbr6\" (UniqueName: \"kubernetes.io/projected/bea1dd6e-5f07-4dd6-a191-f07f59d36043-kube-api-access-bsbr6\") pod \"nova-api-0\" (UID: \"bea1dd6e-5f07-4dd6-a191-f07f59d36043\") " pod="openstack/nova-api-0" Feb 18 14:22:15 crc kubenswrapper[4817]: I0218 14:22:15.126180 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 18 14:22:15 crc kubenswrapper[4817]: I0218 14:22:15.611390 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 18 14:22:15 crc kubenswrapper[4817]: W0218 14:22:15.622519 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbea1dd6e_5f07_4dd6_a191_f07f59d36043.slice/crio-69cdea44d2e68551fc28aea9ad4941006d27361b867e1a265aa5ecc50677b17b WatchSource:0}: Error finding container 69cdea44d2e68551fc28aea9ad4941006d27361b867e1a265aa5ecc50677b17b: Status 404 returned error can't find the container with id 69cdea44d2e68551fc28aea9ad4941006d27361b867e1a265aa5ecc50677b17b Feb 18 14:22:15 crc kubenswrapper[4817]: I0218 14:22:15.693714 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bea1dd6e-5f07-4dd6-a191-f07f59d36043","Type":"ContainerStarted","Data":"69cdea44d2e68551fc28aea9ad4941006d27361b867e1a265aa5ecc50677b17b"} Feb 18 14:22:15 crc kubenswrapper[4817]: I0218 14:22:15.696190 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd52eab4-329f-4cab-83cc-c082d2d3f1d4","Type":"ContainerStarted","Data":"7bc96ea43c574c15f05b3b238c16b57511d6fb768d50bfdfb5fabe03d6929e99"} Feb 18 14:22:16 crc kubenswrapper[4817]: I0218 14:22:16.218670 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32dbdd4a-921a-413d-9feb-102ff3337d7e" path="/var/lib/kubelet/pods/32dbdd4a-921a-413d-9feb-102ff3337d7e/volumes" Feb 18 14:22:16 crc kubenswrapper[4817]: I0218 14:22:16.493104 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 18 14:22:16 crc kubenswrapper[4817]: I0218 14:22:16.717853 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"bd52eab4-329f-4cab-83cc-c082d2d3f1d4","Type":"ContainerStarted","Data":"fed37a926702b09ee95195585c0e910eac1f206b0c1fbbc529c717df2a366503"} Feb 18 14:22:16 crc kubenswrapper[4817]: I0218 14:22:16.721761 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bea1dd6e-5f07-4dd6-a191-f07f59d36043","Type":"ContainerStarted","Data":"39ee07840527205b005f30e4d739d9cfffef0ea04e1ccfb2d080c7dcb0a6b31a"} Feb 18 14:22:16 crc kubenswrapper[4817]: I0218 14:22:16.721796 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bea1dd6e-5f07-4dd6-a191-f07f59d36043","Type":"ContainerStarted","Data":"6a1f854e0c4a995e0d706c1ab70c43787f003d1c63cc27f5c6c164191dd590aa"} Feb 18 14:22:16 crc kubenswrapper[4817]: I0218 14:22:16.746971 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.746948756 podStartE2EDuration="3.746948756s" podCreationTimestamp="2026-02-18 14:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:22:16.73514992 +0000 UTC m=+1399.310685903" watchObservedRunningTime="2026-02-18 14:22:16.746948756 +0000 UTC m=+1399.322484749" Feb 18 14:22:17 crc kubenswrapper[4817]: I0218 14:22:17.750481 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.750456662 podStartE2EDuration="3.750456662s" podCreationTimestamp="2026-02-18 14:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:22:17.749022984 +0000 UTC m=+1400.324558977" watchObservedRunningTime="2026-02-18 14:22:17.750456662 +0000 UTC m=+1400.325992645" Feb 18 14:22:18 crc kubenswrapper[4817]: I0218 14:22:18.342282 4817 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 18 14:22:19 crc kubenswrapper[4817]: I0218 14:22:19.135179 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 14:22:19 crc kubenswrapper[4817]: I0218 14:22:19.135334 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.342712 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.379603 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.700232 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.824774 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-log-httpd\") pod \"2550ff13-9cef-4d0a-a413-21d40e809b87\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.824885 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-scripts\") pod \"2550ff13-9cef-4d0a-a413-21d40e809b87\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.824940 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-config-data\") pod \"2550ff13-9cef-4d0a-a413-21d40e809b87\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.825007 4817 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-run-httpd\") pod \"2550ff13-9cef-4d0a-a413-21d40e809b87\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.825161 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffjbw\" (UniqueName: \"kubernetes.io/projected/2550ff13-9cef-4d0a-a413-21d40e809b87-kube-api-access-ffjbw\") pod \"2550ff13-9cef-4d0a-a413-21d40e809b87\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.825208 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-combined-ca-bundle\") pod \"2550ff13-9cef-4d0a-a413-21d40e809b87\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.825262 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-ceilometer-tls-certs\") pod \"2550ff13-9cef-4d0a-a413-21d40e809b87\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.825302 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-sg-core-conf-yaml\") pod \"2550ff13-9cef-4d0a-a413-21d40e809b87\" (UID: \"2550ff13-9cef-4d0a-a413-21d40e809b87\") " Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.825910 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-log-httpd" (OuterVolumeSpecName: "log-httpd") pod 
"2550ff13-9cef-4d0a-a413-21d40e809b87" (UID: "2550ff13-9cef-4d0a-a413-21d40e809b87"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.826316 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2550ff13-9cef-4d0a-a413-21d40e809b87" (UID: "2550ff13-9cef-4d0a-a413-21d40e809b87"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.831546 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2550ff13-9cef-4d0a-a413-21d40e809b87-kube-api-access-ffjbw" (OuterVolumeSpecName: "kube-api-access-ffjbw") pod "2550ff13-9cef-4d0a-a413-21d40e809b87" (UID: "2550ff13-9cef-4d0a-a413-21d40e809b87"). InnerVolumeSpecName "kube-api-access-ffjbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.832243 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-scripts" (OuterVolumeSpecName: "scripts") pod "2550ff13-9cef-4d0a-a413-21d40e809b87" (UID: "2550ff13-9cef-4d0a-a413-21d40e809b87"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.834754 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2550ff13-9cef-4d0a-a413-21d40e809b87","Type":"ContainerDied","Data":"f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291"} Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.834760 4817 generic.go:334] "Generic (PLEG): container finished" podID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerID="f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291" exitCode=137 Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.834814 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2550ff13-9cef-4d0a-a413-21d40e809b87","Type":"ContainerDied","Data":"d4bcfbf8e0fc0731897328e96563da9a30ebe086e5601256759e23a945f9d3f5"} Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.834840 4817 scope.go:117] "RemoveContainer" containerID="f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.834951 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.866089 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2550ff13-9cef-4d0a-a413-21d40e809b87" (UID: "2550ff13-9cef-4d0a-a413-21d40e809b87"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.875225 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.914622 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2550ff13-9cef-4d0a-a413-21d40e809b87" (UID: "2550ff13-9cef-4d0a-a413-21d40e809b87"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.927448 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffjbw\" (UniqueName: \"kubernetes.io/projected/2550ff13-9cef-4d0a-a413-21d40e809b87-kube-api-access-ffjbw\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.927482 4817 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.927493 4817 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.927501 4817 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.927512 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-scripts\") on node \"crc\" 
DevicePath \"\"" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.927521 4817 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2550ff13-9cef-4d0a-a413-21d40e809b87-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.965560 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2550ff13-9cef-4d0a-a413-21d40e809b87" (UID: "2550ff13-9cef-4d0a-a413-21d40e809b87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.977414 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-config-data" (OuterVolumeSpecName: "config-data") pod "2550ff13-9cef-4d0a-a413-21d40e809b87" (UID: "2550ff13-9cef-4d0a-a413-21d40e809b87"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:22:23 crc kubenswrapper[4817]: I0218 14:22:23.998874 4817 scope.go:117] "RemoveContainer" containerID="121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.022254 4817 scope.go:117] "RemoveContainer" containerID="c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.030757 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.030890 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2550ff13-9cef-4d0a-a413-21d40e809b87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.055536 4817 scope.go:117] "RemoveContainer" containerID="780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.095200 4817 scope.go:117] "RemoveContainer" containerID="f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291" Feb 18 14:22:24 crc kubenswrapper[4817]: E0218 14:22:24.097109 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291\": container with ID starting with f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291 not found: ID does not exist" containerID="f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.097152 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291"} 
err="failed to get container status \"f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291\": rpc error: code = NotFound desc = could not find container \"f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291\": container with ID starting with f2ff8e07b29de796ea9989435892b3dd54afe8572d94e6b21440b66fe1311291 not found: ID does not exist" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.097192 4817 scope.go:117] "RemoveContainer" containerID="121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf" Feb 18 14:22:24 crc kubenswrapper[4817]: E0218 14:22:24.097947 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf\": container with ID starting with 121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf not found: ID does not exist" containerID="121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.097974 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf"} err="failed to get container status \"121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf\": rpc error: code = NotFound desc = could not find container \"121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf\": container with ID starting with 121f8e6c68b4781fd471e0e57f82d2f0f6a0d6adf0ed36ddb60242357cea9ccf not found: ID does not exist" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.098035 4817 scope.go:117] "RemoveContainer" containerID="c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64" Feb 18 14:22:24 crc kubenswrapper[4817]: E0218 14:22:24.098382 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64\": container with ID starting with c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64 not found: ID does not exist" containerID="c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.098418 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64"} err="failed to get container status \"c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64\": rpc error: code = NotFound desc = could not find container \"c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64\": container with ID starting with c7e2ab2d5a29e4f6773856c7acb58e746cf27de4a74d0359c46117e6d03f7e64 not found: ID does not exist" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.098437 4817 scope.go:117] "RemoveContainer" containerID="780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f" Feb 18 14:22:24 crc kubenswrapper[4817]: E0218 14:22:24.099683 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f\": container with ID starting with 780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f not found: ID does not exist" containerID="780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.099775 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f"} err="failed to get container status \"780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f\": rpc error: code = NotFound desc = could not find container \"780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f\": container with ID 
starting with 780ab183ef5ff537ab569ae41b28e75022d93eab5c57e09b52e79a0667cb073f not found: ID does not exist" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.135047 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.135130 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.189935 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.190029 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.213649 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:22:24 crc kubenswrapper[4817]: E0218 14:22:24.214196 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="sg-core" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.214219 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="sg-core" Feb 18 14:22:24 crc kubenswrapper[4817]: E0218 14:22:24.214244 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="proxy-httpd" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.214253 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="proxy-httpd" Feb 18 14:22:24 crc kubenswrapper[4817]: E0218 14:22:24.214270 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="ceilometer-notification-agent" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.214279 4817 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="ceilometer-notification-agent" Feb 18 14:22:24 crc kubenswrapper[4817]: E0218 14:22:24.214305 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="ceilometer-central-agent" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.214314 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="ceilometer-central-agent" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.214573 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="ceilometer-central-agent" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.214592 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="ceilometer-notification-agent" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.214606 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="sg-core" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.214627 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" containerName="proxy-httpd" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.217151 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.220883 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.226416 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.226682 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.236301 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.338596 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.341993 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-scripts\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.342279 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.342335 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.342725 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-config-data\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.342960 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-log-httpd\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.343369 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78tqz\" (UniqueName: \"kubernetes.io/projected/bb3b5a5c-e223-4a62-bd8d-84b653889697-kube-api-access-78tqz\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.344781 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-run-httpd\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.446223 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78tqz\" (UniqueName: \"kubernetes.io/projected/bb3b5a5c-e223-4a62-bd8d-84b653889697-kube-api-access-78tqz\") pod \"ceilometer-0\" 
(UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.446277 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-run-httpd\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.446301 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.446322 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-scripts\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.446369 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.446388 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.446448 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-config-data\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.446496 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-log-httpd\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.447096 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-log-httpd\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.447127 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-run-httpd\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.451256 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.451739 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0" Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.451736 
4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0"
Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.451861 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-scripts\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0"
Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.465510 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78tqz\" (UniqueName: \"kubernetes.io/projected/bb3b5a5c-e223-4a62-bd8d-84b653889697-kube-api-access-78tqz\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0"
Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.465663 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-config-data\") pod \"ceilometer-0\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") " pod="openstack/ceilometer-0"
Feb 18 14:22:24 crc kubenswrapper[4817]: I0218 14:22:24.540898 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:22:25 crc kubenswrapper[4817]: I0218 14:22:25.043853 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:22:25 crc kubenswrapper[4817]: W0218 14:22:25.049167 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb3b5a5c_e223_4a62_bd8d_84b653889697.slice/crio-aad2875d36aec620b1222763db8c0ba44e405b561423e2fcb439049b97f1ca7f WatchSource:0}: Error finding container aad2875d36aec620b1222763db8c0ba44e405b561423e2fcb439049b97f1ca7f: Status 404 returned error can't find the container with id aad2875d36aec620b1222763db8c0ba44e405b561423e2fcb439049b97f1ca7f
Feb 18 14:22:25 crc kubenswrapper[4817]: I0218 14:22:25.127564 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 18 14:22:25 crc kubenswrapper[4817]: I0218 14:22:25.128148 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 18 14:22:25 crc kubenswrapper[4817]: I0218 14:22:25.156161 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bd52eab4-329f-4cab-83cc-c082d2d3f1d4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 18 14:22:25 crc kubenswrapper[4817]: I0218 14:22:25.156185 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bd52eab4-329f-4cab-83cc-c082d2d3f1d4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.233:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 18 14:22:25 crc kubenswrapper[4817]: I0218 14:22:25.868392 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb3b5a5c-e223-4a62-bd8d-84b653889697","Type":"ContainerStarted","Data":"aad2875d36aec620b1222763db8c0ba44e405b561423e2fcb439049b97f1ca7f"}
Feb 18 14:22:26 crc kubenswrapper[4817]: I0218 14:22:26.144211 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bea1dd6e-5f07-4dd6-a191-f07f59d36043" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.234:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 18 14:22:26 crc kubenswrapper[4817]: I0218 14:22:26.144656 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bea1dd6e-5f07-4dd6-a191-f07f59d36043" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.234:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 18 14:22:26 crc kubenswrapper[4817]: I0218 14:22:26.185516 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2550ff13-9cef-4d0a-a413-21d40e809b87" path="/var/lib/kubelet/pods/2550ff13-9cef-4d0a-a413-21d40e809b87/volumes"
Feb 18 14:22:26 crc kubenswrapper[4817]: I0218 14:22:26.882395 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb3b5a5c-e223-4a62-bd8d-84b653889697","Type":"ContainerStarted","Data":"9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd"}
Feb 18 14:22:27 crc kubenswrapper[4817]: I0218 14:22:27.896956 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb3b5a5c-e223-4a62-bd8d-84b653889697","Type":"ContainerStarted","Data":"5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600"}
Feb 18 14:22:28 crc kubenswrapper[4817]: I0218 14:22:28.911215 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb3b5a5c-e223-4a62-bd8d-84b653889697","Type":"ContainerStarted","Data":"d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321"}
Feb 18 14:22:30 crc kubenswrapper[4817]: I0218 14:22:30.941282 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb3b5a5c-e223-4a62-bd8d-84b653889697","Type":"ContainerStarted","Data":"85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681"}
Feb 18 14:22:30 crc kubenswrapper[4817]: I0218 14:22:30.942336 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 18 14:22:30 crc kubenswrapper[4817]: I0218 14:22:30.989922 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.428991449 podStartE2EDuration="6.989897031s" podCreationTimestamp="2026-02-18 14:22:24 +0000 UTC" firstStartedPulling="2026-02-18 14:22:25.051564978 +0000 UTC m=+1407.627100961" lastFinishedPulling="2026-02-18 14:22:29.61247056 +0000 UTC m=+1412.188006543" observedRunningTime="2026-02-18 14:22:30.969264017 +0000 UTC m=+1413.544800000" watchObservedRunningTime="2026-02-18 14:22:30.989897031 +0000 UTC m=+1413.565433024"
Feb 18 14:22:34 crc kubenswrapper[4817]: I0218 14:22:34.150295 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 18 14:22:34 crc kubenswrapper[4817]: I0218 14:22:34.151572 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 18 14:22:34 crc kubenswrapper[4817]: I0218 14:22:34.160000 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 18 14:22:35 crc kubenswrapper[4817]: I0218 14:22:35.003753 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 18 14:22:35 crc kubenswrapper[4817]: I0218 14:22:35.133880 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 18 14:22:35 crc kubenswrapper[4817]: I0218 14:22:35.134395 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 18 14:22:35 crc kubenswrapper[4817]: I0218 14:22:35.161541 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 18 14:22:35 crc kubenswrapper[4817]: I0218 14:22:35.166866 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 18 14:22:36 crc kubenswrapper[4817]: I0218 14:22:36.010239 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 18 14:22:36 crc kubenswrapper[4817]: I0218 14:22:36.015737 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 18 14:22:42 crc kubenswrapper[4817]: I0218 14:22:42.863588 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:22:42 crc kubenswrapper[4817]: I0218 14:22:42.864107 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:22:54 crc kubenswrapper[4817]: I0218 14:22:54.550615 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.495359 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-4xc6g"]
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.506094 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-4xc6g"]
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.550729 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-6mxmm"]
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.552854 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.560517 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.569513 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-6mxmm"]
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.645257 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-combined-ca-bundle\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.645344 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-certs\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.645408 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgpgr\" (UniqueName: \"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-kube-api-access-rgpgr\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.645516 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-scripts\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.645540 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-config-data\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.747730 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgpgr\" (UniqueName: \"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-kube-api-access-rgpgr\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.747929 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-scripts\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.747971 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-config-data\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.748068 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-combined-ca-bundle\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.748123 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-certs\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.759630 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-scripts\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.768770 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-combined-ca-bundle\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.770025 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-certs\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.774505 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-config-data\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.781658 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgpgr\" (UniqueName: \"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-kube-api-access-rgpgr\") pod \"cloudkitty-db-sync-6mxmm\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:07 crc kubenswrapper[4817]: I0218 14:23:07.889496 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6mxmm"
Feb 18 14:23:08 crc kubenswrapper[4817]: I0218 14:23:08.185487 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e385fdc-9c05-49ce-a823-dd99efa98e94" path="/var/lib/kubelet/pods/0e385fdc-9c05-49ce-a823-dd99efa98e94/volumes"
Feb 18 14:23:08 crc kubenswrapper[4817]: I0218 14:23:08.381857 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-6mxmm"]
Feb 18 14:23:09 crc kubenswrapper[4817]: I0218 14:23:09.241572 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:23:09 crc kubenswrapper[4817]: I0218 14:23:09.241915 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="proxy-httpd" containerID="cri-o://85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681" gracePeriod=30
Feb 18 14:23:09 crc kubenswrapper[4817]: I0218 14:23:09.241965 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="ceilometer-central-agent" containerID="cri-o://9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd" gracePeriod=30
Feb 18 14:23:09 crc kubenswrapper[4817]: I0218 14:23:09.241972 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="sg-core" containerID="cri-o://d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321" gracePeriod=30
Feb 18 14:23:09 crc kubenswrapper[4817]: I0218 14:23:09.242006 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="ceilometer-notification-agent" containerID="cri-o://5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600" gracePeriod=30
Feb 18 14:23:09 crc kubenswrapper[4817]: I0218 14:23:09.368453 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6mxmm" event={"ID":"2e4e668a-0bfc-430d-8796-9ed775e01913","Type":"ContainerStarted","Data":"a7d20d234e195966956494ead38ec33cb3335a382b4ff29943ffaeaece23a53c"}
Feb 18 14:23:09 crc kubenswrapper[4817]: I0218 14:23:09.811248 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 18 14:23:10 crc kubenswrapper[4817]: I0218 14:23:10.383200 4817 generic.go:334] "Generic (PLEG): container finished" podID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerID="85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681" exitCode=0
Feb 18 14:23:10 crc kubenswrapper[4817]: I0218 14:23:10.383489 4817 generic.go:334] "Generic (PLEG): container finished" podID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerID="d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321" exitCode=2
Feb 18 14:23:10 crc kubenswrapper[4817]: I0218 14:23:10.383500 4817 generic.go:334] "Generic (PLEG): container finished" podID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerID="9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd" exitCode=0
Feb 18 14:23:10 crc kubenswrapper[4817]: I0218 14:23:10.383268 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb3b5a5c-e223-4a62-bd8d-84b653889697","Type":"ContainerDied","Data":"85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681"}
Feb 18 14:23:10 crc kubenswrapper[4817]: I0218 14:23:10.383542 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb3b5a5c-e223-4a62-bd8d-84b653889697","Type":"ContainerDied","Data":"d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321"}
Feb 18 14:23:10 crc kubenswrapper[4817]: I0218 14:23:10.383556 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb3b5a5c-e223-4a62-bd8d-84b653889697","Type":"ContainerDied","Data":"9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd"}
Feb 18 14:23:10 crc kubenswrapper[4817]: I0218 14:23:10.386016 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6mxmm" event={"ID":"2e4e668a-0bfc-430d-8796-9ed775e01913","Type":"ContainerStarted","Data":"48eec7203a92500b53a432814248b1ba6b6019c370a3555217314aa7397779b1"}
Feb 18 14:23:10 crc kubenswrapper[4817]: I0218 14:23:10.717103 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-6mxmm" podStartSLOduration=2.6204395959999998 podStartE2EDuration="3.717078587s" podCreationTimestamp="2026-02-18 14:23:07 +0000 UTC" firstStartedPulling="2026-02-18 14:23:08.389448375 +0000 UTC m=+1450.964984358" lastFinishedPulling="2026-02-18 14:23:09.486087366 +0000 UTC m=+1452.061623349" observedRunningTime="2026-02-18 14:23:10.403815494 +0000 UTC m=+1452.979351487" watchObservedRunningTime="2026-02-18 14:23:10.717078587 +0000 UTC m=+1453.292614570"
Feb 18 14:23:10 crc kubenswrapper[4817]: I0218 14:23:10.717636 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.164266 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.250922 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-combined-ca-bundle\") pod \"bb3b5a5c-e223-4a62-bd8d-84b653889697\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") "
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.251318 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-scripts\") pod \"bb3b5a5c-e223-4a62-bd8d-84b653889697\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") "
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.251355 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-ceilometer-tls-certs\") pod \"bb3b5a5c-e223-4a62-bd8d-84b653889697\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") "
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.251442 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-sg-core-conf-yaml\") pod \"bb3b5a5c-e223-4a62-bd8d-84b653889697\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") "
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.251606 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-config-data\") pod \"bb3b5a5c-e223-4a62-bd8d-84b653889697\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") "
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.251665 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-log-httpd\") pod \"bb3b5a5c-e223-4a62-bd8d-84b653889697\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") "
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.251735 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-run-httpd\") pod \"bb3b5a5c-e223-4a62-bd8d-84b653889697\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") "
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.251854 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78tqz\" (UniqueName: \"kubernetes.io/projected/bb3b5a5c-e223-4a62-bd8d-84b653889697-kube-api-access-78tqz\") pod \"bb3b5a5c-e223-4a62-bd8d-84b653889697\" (UID: \"bb3b5a5c-e223-4a62-bd8d-84b653889697\") "
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.256134 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bb3b5a5c-e223-4a62-bd8d-84b653889697" (UID: "bb3b5a5c-e223-4a62-bd8d-84b653889697"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.256567 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bb3b5a5c-e223-4a62-bd8d-84b653889697" (UID: "bb3b5a5c-e223-4a62-bd8d-84b653889697"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.260459 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-scripts" (OuterVolumeSpecName: "scripts") pod "bb3b5a5c-e223-4a62-bd8d-84b653889697" (UID: "bb3b5a5c-e223-4a62-bd8d-84b653889697"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.274507 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3b5a5c-e223-4a62-bd8d-84b653889697-kube-api-access-78tqz" (OuterVolumeSpecName: "kube-api-access-78tqz") pod "bb3b5a5c-e223-4a62-bd8d-84b653889697" (UID: "bb3b5a5c-e223-4a62-bd8d-84b653889697"). InnerVolumeSpecName "kube-api-access-78tqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.286285 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bb3b5a5c-e223-4a62-bd8d-84b653889697" (UID: "bb3b5a5c-e223-4a62-bd8d-84b653889697"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.355779 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78tqz\" (UniqueName: \"kubernetes.io/projected/bb3b5a5c-e223-4a62-bd8d-84b653889697-kube-api-access-78tqz\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.355810 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.355820 4817 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.355828 4817 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.355837 4817 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb3b5a5c-e223-4a62-bd8d-84b653889697-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.381244 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bb3b5a5c-e223-4a62-bd8d-84b653889697" (UID: "bb3b5a5c-e223-4a62-bd8d-84b653889697"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.425703 4817 generic.go:334] "Generic (PLEG): container finished" podID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerID="5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600" exitCode=0
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.425750 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb3b5a5c-e223-4a62-bd8d-84b653889697","Type":"ContainerDied","Data":"5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600"}
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.425790 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb3b5a5c-e223-4a62-bd8d-84b653889697","Type":"ContainerDied","Data":"aad2875d36aec620b1222763db8c0ba44e405b561423e2fcb439049b97f1ca7f"}
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.425812 4817 scope.go:117] "RemoveContainer" containerID="85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.425819 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.430523 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb3b5a5c-e223-4a62-bd8d-84b653889697" (UID: "bb3b5a5c-e223-4a62-bd8d-84b653889697"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.442856 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-config-data" (OuterVolumeSpecName: "config-data") pod "bb3b5a5c-e223-4a62-bd8d-84b653889697" (UID: "bb3b5a5c-e223-4a62-bd8d-84b653889697"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.449650 4817 scope.go:117] "RemoveContainer" containerID="d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.459308 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.459347 4817 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.459360 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb3b5a5c-e223-4a62-bd8d-84b653889697-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.474420 4817 scope.go:117] "RemoveContainer" containerID="5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.498836 4817 scope.go:117] "RemoveContainer" containerID="9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.523748 4817 scope.go:117] "RemoveContainer" containerID="85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681"
Feb 18 14:23:12 crc kubenswrapper[4817]: E0218 14:23:12.524267 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681\": container with ID starting with 85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681 not found: ID does not exist" containerID="85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.524300 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681"} err="failed to get container status \"85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681\": rpc error: code = NotFound desc = could not find container \"85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681\": container with ID starting with 85a99f903affb0d19004e0ab694954d6c64026ff3efe8234504281e5fde89681 not found: ID does not exist"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.524325 4817 scope.go:117] "RemoveContainer" containerID="d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321"
Feb 18 14:23:12 crc kubenswrapper[4817]: E0218 14:23:12.524630 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321\": container with ID starting with d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321 not found: ID does not exist" containerID="d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.524654 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321"} err="failed to get container status \"d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321\": rpc error: code = NotFound desc = could not find container \"d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321\": container with ID starting with d43a1b42c663e3b29097b7ec141835a84338509d833d0e22054298d5467d7321 not found: ID does not exist"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.524667 4817 scope.go:117] "RemoveContainer" containerID="5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600"
Feb 18 14:23:12 crc kubenswrapper[4817]: E0218 14:23:12.525222 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600\": container with ID starting with 5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600 not found: ID does not exist" containerID="5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.525290 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600"} err="failed to get container status \"5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600\": rpc error: code = NotFound desc = could not find container \"5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600\": container with ID starting with 5e4d0d64a00da56827db8a745e9fd2024aa3dd7a4e8562cc4593fbb36a925600 not found: ID does not exist"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.525321 4817 scope.go:117] "RemoveContainer" containerID="9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd"
Feb 18 14:23:12 crc kubenswrapper[4817]: E0218 14:23:12.525644 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd\": container with ID starting with 9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd not found: ID does not exist" containerID="9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.525677 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd"} err="failed to get container status \"9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd\": rpc error: code = NotFound desc = could not find container \"9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd\": container with ID starting with 9a2db3452a0648fd66ddab73b7fe8b82d893560d0f12f682da453bdb12048dbd not found: ID does not exist"
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.798489 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.813031 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.900657 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.900729 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:23:12 
crc kubenswrapper[4817]: I0218 14:23:12.904620 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:23:12 crc kubenswrapper[4817]: E0218 14:23:12.905265 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="proxy-httpd" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.905290 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="proxy-httpd" Feb 18 14:23:12 crc kubenswrapper[4817]: E0218 14:23:12.905316 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="ceilometer-notification-agent" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.905325 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="ceilometer-notification-agent" Feb 18 14:23:12 crc kubenswrapper[4817]: E0218 14:23:12.905345 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="sg-core" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.905356 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="sg-core" Feb 18 14:23:12 crc kubenswrapper[4817]: E0218 14:23:12.905374 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="ceilometer-central-agent" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.905382 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="ceilometer-central-agent" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.905718 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="sg-core" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.905748 
4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="ceilometer-notification-agent" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.905768 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="ceilometer-central-agent" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.905797 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" containerName="proxy-httpd" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.908828 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.916322 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.916684 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.917525 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 18 14:23:12 crc kubenswrapper[4817]: I0218 14:23:12.918863 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.104798 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-log-httpd\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.105229 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.105276 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-config-data\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.105303 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-scripts\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.105332 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-run-httpd\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.105386 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhwpg\" (UniqueName: \"kubernetes.io/projected/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-kube-api-access-qhwpg\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.105588 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.105654 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.210236 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.211222 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-config-data\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.211325 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-scripts\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.211797 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-run-httpd\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.211959 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qhwpg\" (UniqueName: \"kubernetes.io/projected/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-kube-api-access-qhwpg\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.212456 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-run-httpd\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.214093 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.214534 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.216849 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-log-httpd\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.217783 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-scripts\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc 
kubenswrapper[4817]: I0218 14:23:13.217968 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.218375 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-log-httpd\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.222832 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.223236 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.223455 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-config-data\") pod \"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.236196 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhwpg\" (UniqueName: \"kubernetes.io/projected/b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8-kube-api-access-qhwpg\") pod 
\"ceilometer-0\" (UID: \"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8\") " pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.236704 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.471189 4817 generic.go:334] "Generic (PLEG): container finished" podID="2e4e668a-0bfc-430d-8796-9ed775e01913" containerID="48eec7203a92500b53a432814248b1ba6b6019c370a3555217314aa7397779b1" exitCode=0 Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.471244 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6mxmm" event={"ID":"2e4e668a-0bfc-430d-8796-9ed775e01913","Type":"ContainerDied","Data":"48eec7203a92500b53a432814248b1ba6b6019c370a3555217314aa7397779b1"} Feb 18 14:23:13 crc kubenswrapper[4817]: I0218 14:23:13.779940 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 18 14:23:14 crc kubenswrapper[4817]: I0218 14:23:14.186644 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3b5a5c-e223-4a62-bd8d-84b653889697" path="/var/lib/kubelet/pods/bb3b5a5c-e223-4a62-bd8d-84b653889697/volumes" Feb 18 14:23:14 crc kubenswrapper[4817]: I0218 14:23:14.485457 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8","Type":"ContainerStarted","Data":"beba5d037098ff349c4897a539d521e07894596a48a8229050a4bf49816fb519"} Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.163496 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-6mxmm" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.267214 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-combined-ca-bundle\") pod \"2e4e668a-0bfc-430d-8796-9ed775e01913\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.267359 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-certs\") pod \"2e4e668a-0bfc-430d-8796-9ed775e01913\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.267393 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-scripts\") pod \"2e4e668a-0bfc-430d-8796-9ed775e01913\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.267456 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-config-data\") pod \"2e4e668a-0bfc-430d-8796-9ed775e01913\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.267562 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgpgr\" (UniqueName: \"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-kube-api-access-rgpgr\") pod \"2e4e668a-0bfc-430d-8796-9ed775e01913\" (UID: \"2e4e668a-0bfc-430d-8796-9ed775e01913\") " Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.274021 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-kube-api-access-rgpgr" (OuterVolumeSpecName: "kube-api-access-rgpgr") pod "2e4e668a-0bfc-430d-8796-9ed775e01913" (UID: "2e4e668a-0bfc-430d-8796-9ed775e01913"). InnerVolumeSpecName "kube-api-access-rgpgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.295204 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-scripts" (OuterVolumeSpecName: "scripts") pod "2e4e668a-0bfc-430d-8796-9ed775e01913" (UID: "2e4e668a-0bfc-430d-8796-9ed775e01913"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.295242 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-certs" (OuterVolumeSpecName: "certs") pod "2e4e668a-0bfc-430d-8796-9ed775e01913" (UID: "2e4e668a-0bfc-430d-8796-9ed775e01913"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.308736 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-config-data" (OuterVolumeSpecName: "config-data") pod "2e4e668a-0bfc-430d-8796-9ed775e01913" (UID: "2e4e668a-0bfc-430d-8796-9ed775e01913"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.329504 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e4e668a-0bfc-430d-8796-9ed775e01913" (UID: "2e4e668a-0bfc-430d-8796-9ed775e01913"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.370566 4817 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-certs\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.370605 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-scripts\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.370618 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.370630 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgpgr\" (UniqueName: \"kubernetes.io/projected/2e4e668a-0bfc-430d-8796-9ed775e01913-kube-api-access-rgpgr\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.370642 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e4e668a-0bfc-430d-8796-9ed775e01913-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.500327 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-6mxmm" event={"ID":"2e4e668a-0bfc-430d-8796-9ed775e01913","Type":"ContainerDied","Data":"a7d20d234e195966956494ead38ec33cb3335a382b4ff29943ffaeaece23a53c"} Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.500376 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7d20d234e195966956494ead38ec33cb3335a382b4ff29943ffaeaece23a53c" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.500443 4817 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-6mxmm" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.605552 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-jgz44"] Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.624572 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-jgz44"] Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.710667 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-rc7gd"] Feb 18 14:23:15 crc kubenswrapper[4817]: E0218 14:23:15.711302 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4e668a-0bfc-430d-8796-9ed775e01913" containerName="cloudkitty-db-sync" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.711326 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4e668a-0bfc-430d-8796-9ed775e01913" containerName="cloudkitty-db-sync" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.711602 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4e668a-0bfc-430d-8796-9ed775e01913" containerName="cloudkitty-db-sync" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.712639 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.715590 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 18 14:23:15 crc kubenswrapper[4817]: E0218 14:23:15.720349 4817 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e4e668a_0bfc_430d_8796_9ed775e01913.slice/crio-a7d20d234e195966956494ead38ec33cb3335a382b4ff29943ffaeaece23a53c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e4e668a_0bfc_430d_8796_9ed775e01913.slice\": RecentStats: unable to find data in memory cache]" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.755635 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-rc7gd"] Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.779710 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-scripts\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.779786 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9dx8\" (UniqueName: \"kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-kube-api-access-k9dx8\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.779880 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-certs\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.780090 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-config-data\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.780924 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-combined-ca-bundle\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.883355 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-combined-ca-bundle\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.883496 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-scripts\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.883530 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9dx8\" (UniqueName: 
\"kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-kube-api-access-k9dx8\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.883598 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-certs\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.883647 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-config-data\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.890733 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-certs\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.890827 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-combined-ca-bundle\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd" Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.891219 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-config-data\") pod \"cloudkitty-storageinit-rc7gd\" 
(UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd"
Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.901561 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-scripts\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd"
Feb 18 14:23:15 crc kubenswrapper[4817]: I0218 14:23:15.912434 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9dx8\" (UniqueName: \"kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-kube-api-access-k9dx8\") pod \"cloudkitty-storageinit-rc7gd\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") " pod="openstack/cloudkitty-storageinit-rc7gd"
Feb 18 14:23:16 crc kubenswrapper[4817]: I0218 14:23:16.046842 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-rc7gd"
Feb 18 14:23:16 crc kubenswrapper[4817]: I0218 14:23:16.055174 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="14e634c8-da00-43a5-96a8-33e8bf806873" containerName="rabbitmq" containerID="cri-o://2187986625b1e0afd9d63e29f131284ecbd1b9924fcf0bd96558e3a42054490b" gracePeriod=604795
Feb 18 14:23:16 crc kubenswrapper[4817]: I0218 14:23:16.177718 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" containerName="rabbitmq" containerID="cri-o://e26ecc0e0a3b4457bde2bdb563a90b212c385b31af07a4839118b11a69829aaf" gracePeriod=604794
Feb 18 14:23:16 crc kubenswrapper[4817]: I0218 14:23:16.191980 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f7d4df-bc28-4a01-a044-091894ac27c2" path="/var/lib/kubelet/pods/c6f7d4df-bc28-4a01-a044-091894ac27c2/volumes"
Feb 18 14:23:18 crc kubenswrapper[4817]: I0218 14:23:18.581425 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused"
Feb 18 14:23:18 crc kubenswrapper[4817]: I0218 14:23:18.922585 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="14e634c8-da00-43a5-96a8-33e8bf806873" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused"
Feb 18 14:23:18 crc kubenswrapper[4817]: W0218 14:23:18.972119 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eec2d51_e854_4ecd_aa31_fd94387e6aa6.slice/crio-7ffeb995945e16d5551677678282632ea028a07126c5a847765342d1947f7bd4 WatchSource:0}: Error finding container 7ffeb995945e16d5551677678282632ea028a07126c5a847765342d1947f7bd4: Status 404 returned error can't find the container with id 7ffeb995945e16d5551677678282632ea028a07126c5a847765342d1947f7bd4
Feb 18 14:23:18 crc kubenswrapper[4817]: I0218 14:23:18.972910 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-rc7gd"]
Feb 18 14:23:19 crc kubenswrapper[4817]: I0218 14:23:19.556353 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8","Type":"ContainerStarted","Data":"36b69b4841bdc321921f404b3633842c62cb66254627db27e89dc08b97d83e96"}
Feb 18 14:23:19 crc kubenswrapper[4817]: I0218 14:23:19.559010 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-rc7gd" event={"ID":"2eec2d51-e854-4ecd-aa31-fd94387e6aa6","Type":"ContainerStarted","Data":"4ba7a12f52d396a8ef524aad1ccad889df65d9d3b445891e515c2ba8acfa3569"}
Feb 18 14:23:19 crc kubenswrapper[4817]: I0218 14:23:19.559054 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-rc7gd" event={"ID":"2eec2d51-e854-4ecd-aa31-fd94387e6aa6","Type":"ContainerStarted","Data":"7ffeb995945e16d5551677678282632ea028a07126c5a847765342d1947f7bd4"}
Feb 18 14:23:19 crc kubenswrapper[4817]: I0218 14:23:19.589918 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-rc7gd" podStartSLOduration=4.58989651 podStartE2EDuration="4.58989651s" podCreationTimestamp="2026-02-18 14:23:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:23:19.587111948 +0000 UTC m=+1462.162647931" watchObservedRunningTime="2026-02-18 14:23:19.58989651 +0000 UTC m=+1462.165432493"
Feb 18 14:23:20 crc kubenswrapper[4817]: I0218 14:23:20.576419 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8","Type":"ContainerStarted","Data":"1b6eda4ff25cbed65b3dc87f687cee291502782174cf5bcceaa1d4333221a990"}
Feb 18 14:23:21 crc kubenswrapper[4817]: I0218 14:23:21.588406 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8","Type":"ContainerStarted","Data":"15ceb9325d46b5500d954fe498b08a4ddde739de79d4129039b946372a6151be"}
Feb 18 14:23:21 crc kubenswrapper[4817]: I0218 14:23:21.592695 4817 generic.go:334] "Generic (PLEG): container finished" podID="2eec2d51-e854-4ecd-aa31-fd94387e6aa6" containerID="4ba7a12f52d396a8ef524aad1ccad889df65d9d3b445891e515c2ba8acfa3569" exitCode=0
Feb 18 14:23:21 crc kubenswrapper[4817]: I0218 14:23:21.592757 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-rc7gd" event={"ID":"2eec2d51-e854-4ecd-aa31-fd94387e6aa6","Type":"ContainerDied","Data":"4ba7a12f52d396a8ef524aad1ccad889df65d9d3b445891e515c2ba8acfa3569"}
Feb 18 14:23:22 crc kubenswrapper[4817]: I0218 14:23:22.616676 4817 generic.go:334] "Generic (PLEG): container finished" podID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" containerID="e26ecc0e0a3b4457bde2bdb563a90b212c385b31af07a4839118b11a69829aaf" exitCode=0
Feb 18 14:23:22 crc kubenswrapper[4817]: I0218 14:23:22.616890 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1","Type":"ContainerDied","Data":"e26ecc0e0a3b4457bde2bdb563a90b212c385b31af07a4839118b11a69829aaf"}
Feb 18 14:23:22 crc kubenswrapper[4817]: I0218 14:23:22.635886 4817 generic.go:334] "Generic (PLEG): container finished" podID="14e634c8-da00-43a5-96a8-33e8bf806873" containerID="2187986625b1e0afd9d63e29f131284ecbd1b9924fcf0bd96558e3a42054490b" exitCode=0
Feb 18 14:23:22 crc kubenswrapper[4817]: I0218 14:23:22.636075 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"14e634c8-da00-43a5-96a8-33e8bf806873","Type":"ContainerDied","Data":"2187986625b1e0afd9d63e29f131284ecbd1b9924fcf0bd96558e3a42054490b"}
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.047552 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.158725 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-rc7gd"
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.191118 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/14e634c8-da00-43a5-96a8-33e8bf806873-pod-info\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.191282 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-tls\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.193476 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.193556 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99sfr\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-kube-api-access-99sfr\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.193607 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-plugins\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.193645 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-erlang-cookie\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.193675 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-server-conf\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.193769 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-config-data\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.193847 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/14e634c8-da00-43a5-96a8-33e8bf806873-erlang-cookie-secret\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.193875 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-confd\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.193907 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-plugins-conf\") pod \"14e634c8-da00-43a5-96a8-33e8bf806873\" (UID: \"14e634c8-da00-43a5-96a8-33e8bf806873\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.225282 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-kube-api-access-99sfr" (OuterVolumeSpecName: "kube-api-access-99sfr") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "kube-api-access-99sfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.236744 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/14e634c8-da00-43a5-96a8-33e8bf806873-pod-info" (OuterVolumeSpecName: "pod-info") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.248545 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.260292 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.260775 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.282587 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.284926 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e634c8-da00-43a5-96a8-33e8bf806873-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.288197 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7" (OuterVolumeSpecName: "persistence") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "pvc-1b819bed-6fd2-438e-9959-c6456132bba7". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.296152 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-certs\") pod \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.296229 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-scripts\") pod \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.296312 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-combined-ca-bundle\") pod \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.296434 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9dx8\" (UniqueName: \"kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-kube-api-access-k9dx8\") pod \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.296571 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-config-data\") pod \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\" (UID: \"2eec2d51-e854-4ecd-aa31-fd94387e6aa6\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.297431 4817 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.297472 4817 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1b819bed-6fd2-438e-9959-c6456132bba7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") on node \"crc\" "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.297488 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99sfr\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-kube-api-access-99sfr\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.297501 4817 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.297513 4817 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.297526 4817 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/14e634c8-da00-43a5-96a8-33e8bf806873-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.297537 4817 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.297549 4817 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/14e634c8-da00-43a5-96a8-33e8bf806873-pod-info\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.304765 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-certs" (OuterVolumeSpecName: "certs") pod "2eec2d51-e854-4ecd-aa31-fd94387e6aa6" (UID: "2eec2d51-e854-4ecd-aa31-fd94387e6aa6"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.323159 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-scripts" (OuterVolumeSpecName: "scripts") pod "2eec2d51-e854-4ecd-aa31-fd94387e6aa6" (UID: "2eec2d51-e854-4ecd-aa31-fd94387e6aa6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.323940 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-kube-api-access-k9dx8" (OuterVolumeSpecName: "kube-api-access-k9dx8") pod "2eec2d51-e854-4ecd-aa31-fd94387e6aa6" (UID: "2eec2d51-e854-4ecd-aa31-fd94387e6aa6"). InnerVolumeSpecName "kube-api-access-k9dx8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.363439 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-server-conf" (OuterVolumeSpecName: "server-conf") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.396193 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-config-data" (OuterVolumeSpecName: "config-data") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.398649 4817 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.399183 4817 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1b819bed-6fd2-438e-9959-c6456132bba7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7") on node "crc"
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.399745 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.400518 4817 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-certs\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.400537 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.400546 4817 reconciler_common.go:293] "Volume detached for volume \"pvc-1b819bed-6fd2-438e-9959-c6456132bba7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.400556 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9dx8\" (UniqueName: \"kubernetes.io/projected/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-kube-api-access-k9dx8\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.400566 4817 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-server-conf\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.400782 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14e634c8-da00-43a5-96a8-33e8bf806873-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.414127 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-config-data" (OuterVolumeSpecName: "config-data") pod "2eec2d51-e854-4ecd-aa31-fd94387e6aa6" (UID: "2eec2d51-e854-4ecd-aa31-fd94387e6aa6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.437769 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2eec2d51-e854-4ecd-aa31-fd94387e6aa6" (UID: "2eec2d51-e854-4ecd-aa31-fd94387e6aa6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.502496 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-pod-info\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.502909 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgh5g\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-kube-api-access-wgh5g\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.502976 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-config-data\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.503103 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-confd\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.503175 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-erlang-cookie\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.503254 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-tls\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.504708 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.504760 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-plugins-conf\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.504786 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-server-conf\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.504859 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-plugins\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.504924 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-erlang-cookie-secret\") pod \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\" (UID: \"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1\") "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.514104 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.520389 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.520432 4817 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.520446 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eec2d51-e854-4ecd-aa31-fd94387e6aa6-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.522711 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.523661 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.531571 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-kube-api-access-wgh5g" (OuterVolumeSpecName: "kube-api-access-wgh5g") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "kube-api-access-wgh5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.537571 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-pod-info" (OuterVolumeSpecName: "pod-info") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.539022 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.542431 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.627179 4817 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.627244 4817 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.627256 4817 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.627267 4817 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.627277 4817 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-pod-info\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.627288 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgh5g\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-kube-api-access-wgh5g\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.628870 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423" (OuterVolumeSpecName: "persistence") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.653902 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-config-data" (OuterVolumeSpecName: "config-data") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.668126 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-server-conf" (OuterVolumeSpecName: "server-conf") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.679635 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"14e634c8-da00-43a5-96a8-33e8bf806873","Type":"ContainerDied","Data":"cc1428f1cfb51837027e2a64e977cce278d40c00f9339e7400e81a892ceddf10"}
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.679697 4817 scope.go:117] "RemoveContainer" containerID="2187986625b1e0afd9d63e29f131284ecbd1b9924fcf0bd96558e3a42054490b"
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.679873 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.686105 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-rc7gd"
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.686645 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-rc7gd" event={"ID":"2eec2d51-e854-4ecd-aa31-fd94387e6aa6","Type":"ContainerDied","Data":"7ffeb995945e16d5551677678282632ea028a07126c5a847765342d1947f7bd4"}
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.686678 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ffeb995945e16d5551677678282632ea028a07126c5a847765342d1947f7bd4"
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.695476 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.697013 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1","Type":"ContainerDied","Data":"8a663cf915c818bbce22dc9d945e60e5103401059ab3d8fe702524bb142600a5"}
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.732232 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.732486 4817 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") on node \"crc\" "
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.732567 4817 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-server-conf\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.772213 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "14e634c8-da00-43a5-96a8-33e8bf806873" (UID: "14e634c8-da00-43a5-96a8-33e8bf806873"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.799752 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.800229 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="542029d6-ef61-49d1-88a7-c206c64e193a" containerName="cloudkitty-proc" containerID="cri-o://ed628f07772e2e05d3087689a31bbe40ceac0ae62b9496713f1d17d667347f91" gracePeriod=30
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.815859 4817 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.816171 4817 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423") on node "crc"
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.819664 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.820106 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerName="cloudkitty-api" containerID="cri-o://fec6236b0102236a0ffc04af329220acf3c2ccbfcacfee8aa9a74f16d2df810a" gracePeriod=30
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.820298 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerName="cloudkitty-api-log" containerID="cri-o://64325c38df09c7e7258bae291e3bc9a3bc32ce81b5be56316cf60579570582a1" gracePeriod=30
Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.834365 4817
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/14e634c8-da00-43a5-96a8-33e8bf806873-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.834664 4817 reconciler_common.go:293] "Volume detached for volume \"pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.844574 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" (UID: "d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.885250 4817 scope.go:117] "RemoveContainer" containerID="9a5e9d303e366c3bb999a78527cf3761a74f192fe80921e3e7ff942f31d4204e" Feb 18 14:23:23 crc kubenswrapper[4817]: I0218 14:23:23.938762 4817 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.032180 4817 scope.go:117] "RemoveContainer" containerID="e26ecc0e0a3b4457bde2bdb563a90b212c385b31af07a4839118b11a69829aaf" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.057679 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.071045 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.081133 4817 scope.go:117] "RemoveContainer" 
containerID="3a7b1cf4852e8739319f69c8a261a36817c331f73c78b92b691e2916854cabb5" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.088047 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.107815 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.122428 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 14:23:24 crc kubenswrapper[4817]: E0218 14:23:24.122854 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e634c8-da00-43a5-96a8-33e8bf806873" containerName="rabbitmq" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.122870 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e634c8-da00-43a5-96a8-33e8bf806873" containerName="rabbitmq" Feb 18 14:23:24 crc kubenswrapper[4817]: E0218 14:23:24.122886 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eec2d51-e854-4ecd-aa31-fd94387e6aa6" containerName="cloudkitty-storageinit" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.122895 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eec2d51-e854-4ecd-aa31-fd94387e6aa6" containerName="cloudkitty-storageinit" Feb 18 14:23:24 crc kubenswrapper[4817]: E0218 14:23:24.122911 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" containerName="setup-container" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.122918 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" containerName="setup-container" Feb 18 14:23:24 crc kubenswrapper[4817]: E0218 14:23:24.122934 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" containerName="rabbitmq" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.122940 
4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" containerName="rabbitmq" Feb 18 14:23:24 crc kubenswrapper[4817]: E0218 14:23:24.122971 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e634c8-da00-43a5-96a8-33e8bf806873" containerName="setup-container" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.122996 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e634c8-da00-43a5-96a8-33e8bf806873" containerName="setup-container" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.123161 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e634c8-da00-43a5-96a8-33e8bf806873" containerName="rabbitmq" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.123178 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eec2d51-e854-4ecd-aa31-fd94387e6aa6" containerName="cloudkitty-storageinit" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.123192 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" containerName="rabbitmq" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.124546 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.133477 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5rlj7" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.133714 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.133849 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.134015 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.134117 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.134251 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.134402 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.157181 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.161811 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.176582 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.176790 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-v8dct" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.176904 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.177079 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.177204 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.177452 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.177569 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.230077 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e634c8-da00-43a5-96a8-33e8bf806873" path="/var/lib/kubelet/pods/14e634c8-da00-43a5-96a8-33e8bf806873/volumes" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.233163 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1" path="/var/lib/kubelet/pods/d6c1fa65-e9bf-4b06-a643-2095fbc8e7d1/volumes" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.241866 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.241924 
4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256116 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vtjw\" (UniqueName: \"kubernetes.io/projected/e19f3906-864f-49f8-b3f1-e3cfbcae4133-kube-api-access-6vtjw\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256201 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256237 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256274 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-679j8\" (UniqueName: \"kubernetes.io/projected/f49989fd-6326-4020-aba0-45b49ed37872-kube-api-access-679j8\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256297 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-erlang-cookie\") 
pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256335 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19f3906-864f-49f8-b3f1-e3cfbcae4133-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256367 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256391 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f49989fd-6326-4020-aba0-45b49ed37872-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256425 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19f3906-864f-49f8-b3f1-e3cfbcae4133-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256460 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f49989fd-6326-4020-aba0-45b49ed37872-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256519 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256567 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f49989fd-6326-4020-aba0-45b49ed37872-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256658 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256689 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256720 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1b819bed-6fd2-438e-9959-c6456132bba7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256751 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e19f3906-864f-49f8-b3f1-e3cfbcae4133-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256808 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f49989fd-6326-4020-aba0-45b49ed37872-config-data\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256862 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19f3906-864f-49f8-b3f1-e3cfbcae4133-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256901 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f49989fd-6326-4020-aba0-45b49ed37872-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256932 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.256967 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19f3906-864f-49f8-b3f1-e3cfbcae4133-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.257081 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368412 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vtjw\" (UniqueName: \"kubernetes.io/projected/e19f3906-864f-49f8-b3f1-e3cfbcae4133-kube-api-access-6vtjw\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368484 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368538 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368582 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-679j8\" (UniqueName: \"kubernetes.io/projected/f49989fd-6326-4020-aba0-45b49ed37872-kube-api-access-679j8\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368605 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368657 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19f3906-864f-49f8-b3f1-e3cfbcae4133-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368687 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368703 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f49989fd-6326-4020-aba0-45b49ed37872-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368754 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19f3906-864f-49f8-b3f1-e3cfbcae4133-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368820 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f49989fd-6326-4020-aba0-45b49ed37872-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368857 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368902 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f49989fd-6326-4020-aba0-45b49ed37872-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.368977 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.369024 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.369054 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1b819bed-6fd2-438e-9959-c6456132bba7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.369087 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e19f3906-864f-49f8-b3f1-e3cfbcae4133-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.369146 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f49989fd-6326-4020-aba0-45b49ed37872-config-data\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.369195 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19f3906-864f-49f8-b3f1-e3cfbcae4133-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.369232 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f49989fd-6326-4020-aba0-45b49ed37872-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.369265 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.369308 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19f3906-864f-49f8-b3f1-e3cfbcae4133-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.369330 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.369858 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.370270 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.370641 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19f3906-864f-49f8-b3f1-e3cfbcae4133-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.378547 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.378859 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.380253 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19f3906-864f-49f8-b3f1-e3cfbcae4133-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.380814 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f49989fd-6326-4020-aba0-45b49ed37872-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.383864 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19f3906-864f-49f8-b3f1-e3cfbcae4133-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.385088 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f49989fd-6326-4020-aba0-45b49ed37872-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.389753 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f49989fd-6326-4020-aba0-45b49ed37872-config-data\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.390297 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19f3906-864f-49f8-b3f1-e3cfbcae4133-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.390859 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.391507 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.397176 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f49989fd-6326-4020-aba0-45b49ed37872-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.398120 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.401550 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ec3ed5eb3e080d5af3d3f8c52d88ffd1c20f1133cdda296c9c2f6b5efdb03bc5/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.398489 4817 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.401876 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1b819bed-6fd2-438e-9959-c6456132bba7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/03221d3a9d26cd285b43664e9f0aaceceb14476da6a82478bb8f80895eef44b7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.403680 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f49989fd-6326-4020-aba0-45b49ed37872-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.407689 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f49989fd-6326-4020-aba0-45b49ed37872-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.410548 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e19f3906-864f-49f8-b3f1-e3cfbcae4133-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.413912 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e19f3906-864f-49f8-b3f1-e3cfbcae4133-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.418328 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vtjw\" (UniqueName: \"kubernetes.io/projected/e19f3906-864f-49f8-b3f1-e3cfbcae4133-kube-api-access-6vtjw\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.429647 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-679j8\" (UniqueName: \"kubernetes.io/projected/f49989fd-6326-4020-aba0-45b49ed37872-kube-api-access-679j8\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.655022 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8ea7e34d-f8e0-47a0-a00a-1c7ce9082423\") pod \"rabbitmq-server-0\" (UID: \"f49989fd-6326-4020-aba0-45b49ed37872\") " pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.663917 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1b819bed-6fd2-438e-9959-c6456132bba7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b819bed-6fd2-438e-9959-c6456132bba7\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19f3906-864f-49f8-b3f1-e3cfbcae4133\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.713186 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8","Type":"ContainerStarted","Data":"0060fd0964db5e4484f2dd8e95ad53506ad4323d7269aa4a2f0e5dd6e0162ae9"}
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.713408 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.722701 4817 generic.go:334] "Generic (PLEG): container finished" podID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerID="64325c38df09c7e7258bae291e3bc9a3bc32ce81b5be56316cf60579570582a1" exitCode=143
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.722952 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"eaa4f949-7d98-4a16-a5ed-ba88f9e89820","Type":"ContainerDied","Data":"64325c38df09c7e7258bae291e3bc9a3bc32ce81b5be56316cf60579570582a1"}
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.752726 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.946483283 podStartE2EDuration="12.75270086s" podCreationTimestamp="2026-02-18 14:23:12 +0000 UTC" firstStartedPulling="2026-02-18 14:23:13.787810615 +0000 UTC m=+1456.363346598" lastFinishedPulling="2026-02-18 14:23:23.594028192 +0000 UTC m=+1466.169564175" observedRunningTime="2026-02-18 14:23:24.742624499 +0000 UTC m=+1467.318160482" watchObservedRunningTime="2026-02-18 14:23:24.75270086 +0000 UTC m=+1467.328236843"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.777620 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 18 14:23:24 crc kubenswrapper[4817]: I0218 14:23:24.834207 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.393384 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.434403 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 18 14:23:25 crc kubenswrapper[4817]: W0218 14:23:25.434725 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49989fd_6326_4020_aba0_45b49ed37872.slice/crio-e95c8f62f1a2ae95f4b18ec6cb5dd03a183d14d7dd4edbdd774c553983690c7b WatchSource:0}: Error finding container e95c8f62f1a2ae95f4b18ec6cb5dd03a183d14d7dd4edbdd774c553983690c7b: Status 404 returned error can't find the container with id e95c8f62f1a2ae95f4b18ec6cb5dd03a183d14d7dd4edbdd774c553983690c7b
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.442738 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.210:8889/healthcheck\": read tcp 10.217.0.2:43920->10.217.0.210:8889: read: connection reset by peer"
Feb 18 14:23:25 crc kubenswrapper[4817]: W0218 14:23:25.481807 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode19f3906_864f_49f8_b3f1_e3cfbcae4133.slice/crio-1a8134a38ef4e382c6c1bd55b847886ace8010890555be5ecd7b2cf24f1c4f5c WatchSource:0}: Error finding container 1a8134a38ef4e382c6c1bd55b847886ace8010890555be5ecd7b2cf24f1c4f5c: Status 404 returned error can't find the container with id 1a8134a38ef4e382c6c1bd55b847886ace8010890555be5ecd7b2cf24f1c4f5c
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.739303 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f49989fd-6326-4020-aba0-45b49ed37872","Type":"ContainerStarted","Data":"e95c8f62f1a2ae95f4b18ec6cb5dd03a183d14d7dd4edbdd774c553983690c7b"}
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.743018 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e19f3906-864f-49f8-b3f1-e3cfbcae4133","Type":"ContainerStarted","Data":"1a8134a38ef4e382c6c1bd55b847886ace8010890555be5ecd7b2cf24f1c4f5c"}
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.748461 4817 generic.go:334] "Generic (PLEG): container finished" podID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerID="fec6236b0102236a0ffc04af329220acf3c2ccbfcacfee8aa9a74f16d2df810a" exitCode=0
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.748597 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"eaa4f949-7d98-4a16-a5ed-ba88f9e89820","Type":"ContainerDied","Data":"fec6236b0102236a0ffc04af329220acf3c2ccbfcacfee8aa9a74f16d2df810a"}
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.755936 4817 generic.go:334] "Generic (PLEG): container finished" podID="542029d6-ef61-49d1-88a7-c206c64e193a" containerID="ed628f07772e2e05d3087689a31bbe40ceac0ae62b9496713f1d17d667347f91" exitCode=0
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.757893 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"542029d6-ef61-49d1-88a7-c206c64e193a","Type":"ContainerDied","Data":"ed628f07772e2e05d3087689a31bbe40ceac0ae62b9496713f1d17d667347f91"}
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.758018 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"542029d6-ef61-49d1-88a7-c206c64e193a","Type":"ContainerDied","Data":"1d038fe129143d145b72ed4c3401bd5ad4b2053bdcda35c44413a585ce7000b8"}
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.758042 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d038fe129143d145b72ed4c3401bd5ad4b2053bdcda35c44413a585ce7000b8"
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.911248 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.940374 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data\") pod \"542029d6-ef61-49d1-88a7-c206c64e193a\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") "
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.940464 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-combined-ca-bundle\") pod \"542029d6-ef61-49d1-88a7-c206c64e193a\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") "
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.940521 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kkcz\" (UniqueName: \"kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-kube-api-access-9kkcz\") pod \"542029d6-ef61-49d1-88a7-c206c64e193a\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") "
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.940774 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data-custom\") pod \"542029d6-ef61-49d1-88a7-c206c64e193a\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") "
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.940890 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-scripts\") pod \"542029d6-ef61-49d1-88a7-c206c64e193a\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") "
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.941159 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-certs\") pod \"542029d6-ef61-49d1-88a7-c206c64e193a\" (UID: \"542029d6-ef61-49d1-88a7-c206c64e193a\") "
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.949596 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "542029d6-ef61-49d1-88a7-c206c64e193a" (UID: "542029d6-ef61-49d1-88a7-c206c64e193a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:25 crc kubenswrapper[4817]: I0218 14:23:25.952678 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-scripts" (OuterVolumeSpecName: "scripts") pod "542029d6-ef61-49d1-88a7-c206c64e193a" (UID: "542029d6-ef61-49d1-88a7-c206c64e193a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.044102 4817 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.044402 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.048787 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-kube-api-access-9kkcz" (OuterVolumeSpecName: "kube-api-access-9kkcz") pod "542029d6-ef61-49d1-88a7-c206c64e193a" (UID: "542029d6-ef61-49d1-88a7-c206c64e193a"). InnerVolumeSpecName "kube-api-access-9kkcz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.051140 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-certs" (OuterVolumeSpecName: "certs") pod "542029d6-ef61-49d1-88a7-c206c64e193a" (UID: "542029d6-ef61-49d1-88a7-c206c64e193a"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.077164 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.090776 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "542029d6-ef61-49d1-88a7-c206c64e193a" (UID: "542029d6-ef61-49d1-88a7-c206c64e193a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.120621 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data" (OuterVolumeSpecName: "config-data") pod "542029d6-ef61-49d1-88a7-c206c64e193a" (UID: "542029d6-ef61-49d1-88a7-c206c64e193a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.147330 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-public-tls-certs\") pod \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") "
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.147454 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data-custom\") pod \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") "
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.147732 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-combined-ca-bundle\") pod \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") "
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.147854 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-certs\") pod \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") "
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.147893 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-internal-tls-certs\") pod \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") "
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.147934 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-logs\") pod \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") "
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.148043 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-scripts\") pod \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") "
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.148166 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data\") pod \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") "
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.148256 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2cdm\" (UniqueName: \"kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-kube-api-access-n2cdm\") pod \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\" (UID: \"eaa4f949-7d98-4a16-a5ed-ba88f9e89820\") "
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.149276 4817 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-certs\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.149310 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.149324 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542029d6-ef61-49d1-88a7-c206c64e193a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.149339 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kkcz\" (UniqueName: \"kubernetes.io/projected/542029d6-ef61-49d1-88a7-c206c64e193a-kube-api-access-9kkcz\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.150827 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-logs" (OuterVolumeSpecName: "logs") pod "eaa4f949-7d98-4a16-a5ed-ba88f9e89820" (UID: "eaa4f949-7d98-4a16-a5ed-ba88f9e89820"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.154904 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-kube-api-access-n2cdm" (OuterVolumeSpecName: "kube-api-access-n2cdm") pod "eaa4f949-7d98-4a16-a5ed-ba88f9e89820" (UID: "eaa4f949-7d98-4a16-a5ed-ba88f9e89820"). InnerVolumeSpecName "kube-api-access-n2cdm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.156975 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-certs" (OuterVolumeSpecName: "certs") pod "eaa4f949-7d98-4a16-a5ed-ba88f9e89820" (UID: "eaa4f949-7d98-4a16-a5ed-ba88f9e89820"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.171809 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eaa4f949-7d98-4a16-a5ed-ba88f9e89820" (UID: "eaa4f949-7d98-4a16-a5ed-ba88f9e89820"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.173178 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-scripts" (OuterVolumeSpecName: "scripts") pod "eaa4f949-7d98-4a16-a5ed-ba88f9e89820" (UID: "eaa4f949-7d98-4a16-a5ed-ba88f9e89820"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: E0218 14:23:26.208620 4817 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa4f949_7d98_4a16_a5ed_ba88f9e89820.slice/crio-conmon-fec6236b0102236a0ffc04af329220acf3c2ccbfcacfee8aa9a74f16d2df810a.scope\": RecentStats: unable to find data in memory cache]"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.214340 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data" (OuterVolumeSpecName: "config-data") pod "eaa4f949-7d98-4a16-a5ed-ba88f9e89820" (UID: "eaa4f949-7d98-4a16-a5ed-ba88f9e89820"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.229459 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaa4f949-7d98-4a16-a5ed-ba88f9e89820" (UID: "eaa4f949-7d98-4a16-a5ed-ba88f9e89820"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.253932 4817 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-logs\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.254344 4817 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-scripts\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.254359 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.254372 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2cdm\" (UniqueName: \"kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-kube-api-access-n2cdm\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.254388 4817 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.254400 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.254412 4817 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-certs\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.274525 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eaa4f949-7d98-4a16-a5ed-ba88f9e89820" (UID: "eaa4f949-7d98-4a16-a5ed-ba88f9e89820"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.306158 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eaa4f949-7d98-4a16-a5ed-ba88f9e89820" (UID: "eaa4f949-7d98-4a16-a5ed-ba88f9e89820"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.356629 4817 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.356672 4817 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa4f949-7d98-4a16-a5ed-ba88f9e89820-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.773830 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.773845 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"eaa4f949-7d98-4a16-a5ed-ba88f9e89820","Type":"ContainerDied","Data":"5bd3db7d0de001e87bd71405af9e1e34d5442053f8913579094238d1dbf59dac"}
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.773868 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.773965 4817 scope.go:117] "RemoveContainer" containerID="fec6236b0102236a0ffc04af329220acf3c2ccbfcacfee8aa9a74f16d2df810a"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.818168 4817 scope.go:117] "RemoveContainer" containerID="64325c38df09c7e7258bae291e3bc9a3bc32ce81b5be56316cf60579570582a1"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.832051 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.853952 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.875707 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.899115 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 18 14:23:26 crc kubenswrapper[4817]: E0218 14:23:26.899595 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerName="cloudkitty-api"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.899615 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerName="cloudkitty-api"
Feb 18 14:23:26 crc kubenswrapper[4817]: E0218 14:23:26.899631 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerName="cloudkitty-api-log"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.899638 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerName="cloudkitty-api-log"
Feb 18 14:23:26 crc kubenswrapper[4817]: E0218 14:23:26.899658 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542029d6-ef61-49d1-88a7-c206c64e193a" containerName="cloudkitty-proc"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.899665 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="542029d6-ef61-49d1-88a7-c206c64e193a" containerName="cloudkitty-proc"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.899840 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerName="cloudkitty-api"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.899864 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="542029d6-ef61-49d1-88a7-c206c64e193a" containerName="cloudkitty-proc"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.899872 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" containerName="cloudkitty-api-log"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.900624 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.905455 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.905698 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.906124 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.906331 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-zgqz6"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.906491 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.923026 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.941583 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.984464 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.984615 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xshdk\" (UniqueName: \"kubernetes.io/projected/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-kube-api-access-xshdk\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.997351 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-scripts\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.997529 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-certs\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.997637 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.997728 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0"
Feb 18 14:23:26 crc kubenswrapper[4817]: I0218 14:23:26.997938 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-config-data\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0"
Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.002816 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.005237 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.005586 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.008308 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.030363 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100482 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-logs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100546 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100587 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-config-data\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100664 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100715 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-certs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100733 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xshdk\" (UniqueName: \"kubernetes.io/projected/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-kube-api-access-xshdk\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100754 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-scripts\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100777 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100795 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-config-data\") pod \"cloudkitty-api-0\" (UID: 
\"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100822 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-certs\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100847 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-scripts\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100868 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100892 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.100909 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 
14:23:27.100932 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmttw\" (UniqueName: \"kubernetes.io/projected/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-kube-api-access-xmttw\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.203319 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-certs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.203705 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.203728 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-config-data\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.203782 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-scripts\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.203811 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.203842 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.203879 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmttw\" (UniqueName: \"kubernetes.io/projected/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-kube-api-access-xmttw\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.203906 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-logs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.203939 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.204944 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-logs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 
14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.250791 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.250957 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-certs\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.251159 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.251857 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-config-data\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.253120 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-scripts\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.253126 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.253190 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xshdk\" (UniqueName: \"kubernetes.io/projected/ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad-kube-api-access-xshdk\") pod \"cloudkitty-proc-0\" (UID: \"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad\") " pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.253200 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-certs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.256546 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.257120 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-config-data\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.258650 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-scripts\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 
14:23:27.259768 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.270777 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmttw\" (UniqueName: \"kubernetes.io/projected/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-kube-api-access-xmttw\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.279865 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f7d0fc-a70f-4296-82f1-1cdd302a4a60-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"09f7d0fc-a70f-4296-82f1-1cdd302a4a60\") " pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.336483 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.538058 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.792297 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f49989fd-6326-4020-aba0-45b49ed37872","Type":"ContainerStarted","Data":"76bb09153cb11303694519dadf556026c02ba550bffb5ac15c6eafddb920d686"} Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.798140 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e19f3906-864f-49f8-b3f1-e3cfbcae4133","Type":"ContainerStarted","Data":"d44c9d63a6330045a9fb9164d56ca185d80fcb7977ef909fd3029fa204fa25c2"} Feb 18 14:23:27 crc kubenswrapper[4817]: I0218 14:23:27.981513 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.185320 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="542029d6-ef61-49d1-88a7-c206c64e193a" path="/var/lib/kubelet/pods/542029d6-ef61-49d1-88a7-c206c64e193a/volumes" Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.186359 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaa4f949-7d98-4a16-a5ed-ba88f9e89820" path="/var/lib/kubelet/pods/eaa4f949-7d98-4a16-a5ed-ba88f9e89820/volumes" Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.236921 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.513351 4817 scope.go:117] "RemoveContainer" containerID="7852bca6be599402e9ef8a2226e86dbd411f7a7941dbf4d06f006e6602f294f0" Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.604776 4817 scope.go:117] "RemoveContainer" containerID="dbaeaf695a0606c28a668d8f270b73ce7468bb3cb34c594fa65af74723208d2d" Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.674480 4817 scope.go:117] "RemoveContainer" 
containerID="4b71496c76fcfa64e3b94ed413e473b6c3559b2cd28dd667d386cc78bbcb413e" Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.720704 4817 scope.go:117] "RemoveContainer" containerID="3e253b2e3d34a6940187b42d285539b0c2670e0a2e80d4122ef03184dfff4251" Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.835698 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"09f7d0fc-a70f-4296-82f1-1cdd302a4a60","Type":"ContainerStarted","Data":"82ae5946f6ec71f8c65952712f6b691899016f5ee4240b7ff86114f0a2ee76b9"} Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.835796 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"09f7d0fc-a70f-4296-82f1-1cdd302a4a60","Type":"ContainerStarted","Data":"7a7fad78c227aa64b5b4b933b92820010954d5a3a8178fb64c3d1d2d6d9fdb66"} Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.835817 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"09f7d0fc-a70f-4296-82f1-1cdd302a4a60","Type":"ContainerStarted","Data":"40d8d2bee554c2cc9c681e640d945262522f2e0ae066e88c7c6e60b8a36ac3b0"} Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.835940 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.844465 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad","Type":"ContainerStarted","Data":"a5de2342ead4b40d8199abd545744ffc8d4885eac8a022594b374482bae07e1c"} Feb 18 14:23:28 crc kubenswrapper[4817]: I0218 14:23:28.901354 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.901330344 podStartE2EDuration="2.901330344s" podCreationTimestamp="2026-02-18 14:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:23:28.868808511 +0000 UTC m=+1471.444344494" watchObservedRunningTime="2026-02-18 14:23:28.901330344 +0000 UTC m=+1471.476866327" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.464320 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74554f47dc-zx2qs"] Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.467019 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.472689 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.533274 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74554f47dc-zx2qs"] Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.663043 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-swift-storage-0\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.663152 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-svc\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.663217 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-sb\") pod 
\"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.663244 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-nb\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.663295 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-config\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.663314 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx9ln\" (UniqueName: \"kubernetes.io/projected/41ea7547-0242-42c0-9f0f-2f24a8c346ec-kube-api-access-xx9ln\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.663671 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-openstack-edpm-ipam\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.764911 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-openstack-edpm-ipam\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.765931 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-swift-storage-0\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.766061 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-openstack-edpm-ipam\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.766683 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-swift-storage-0\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.766880 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-svc\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.767492 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-svc\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.767626 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-sb\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.767656 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-nb\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.768259 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-sb\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.768405 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-config\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.768425 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx9ln\" (UniqueName: 
\"kubernetes.io/projected/41ea7547-0242-42c0-9f0f-2f24a8c346ec-kube-api-access-xx9ln\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.768509 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-nb\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.769022 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-config\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.801252 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx9ln\" (UniqueName: \"kubernetes.io/projected/41ea7547-0242-42c0-9f0f-2f24a8c346ec-kube-api-access-xx9ln\") pod \"dnsmasq-dns-74554f47dc-zx2qs\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.802432 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.886079 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad","Type":"ContainerStarted","Data":"d29b639b861f81008005efaacc600db41e561ca3cdf4192d57ce867e411f4be8"} Feb 18 14:23:29 crc kubenswrapper[4817]: I0218 14:23:29.921240 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=3.271666915 podStartE2EDuration="3.921214267s" podCreationTimestamp="2026-02-18 14:23:26 +0000 UTC" firstStartedPulling="2026-02-18 14:23:28.239916434 +0000 UTC m=+1470.815452427" lastFinishedPulling="2026-02-18 14:23:28.889463796 +0000 UTC m=+1471.464999779" observedRunningTime="2026-02-18 14:23:29.907926083 +0000 UTC m=+1472.483462076" watchObservedRunningTime="2026-02-18 14:23:29.921214267 +0000 UTC m=+1472.496750250" Feb 18 14:23:30 crc kubenswrapper[4817]: I0218 14:23:30.142039 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74554f47dc-zx2qs"] Feb 18 14:23:30 crc kubenswrapper[4817]: W0218 14:23:30.151650 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ea7547_0242_42c0_9f0f_2f24a8c346ec.slice/crio-6af55609751b52df33135e6292bc7688abdf190c51d6af76a6067b9fc7c12299 WatchSource:0}: Error finding container 6af55609751b52df33135e6292bc7688abdf190c51d6af76a6067b9fc7c12299: Status 404 returned error can't find the container with id 6af55609751b52df33135e6292bc7688abdf190c51d6af76a6067b9fc7c12299 Feb 18 14:23:30 crc kubenswrapper[4817]: I0218 14:23:30.897526 4817 generic.go:334] "Generic (PLEG): container finished" podID="41ea7547-0242-42c0-9f0f-2f24a8c346ec" containerID="81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6" exitCode=0 Feb 18 14:23:30 crc kubenswrapper[4817]: I0218 
14:23:30.897656 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" event={"ID":"41ea7547-0242-42c0-9f0f-2f24a8c346ec","Type":"ContainerDied","Data":"81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6"} Feb 18 14:23:30 crc kubenswrapper[4817]: I0218 14:23:30.897952 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" event={"ID":"41ea7547-0242-42c0-9f0f-2f24a8c346ec","Type":"ContainerStarted","Data":"6af55609751b52df33135e6292bc7688abdf190c51d6af76a6067b9fc7c12299"} Feb 18 14:23:31 crc kubenswrapper[4817]: I0218 14:23:31.909548 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" event={"ID":"41ea7547-0242-42c0-9f0f-2f24a8c346ec","Type":"ContainerStarted","Data":"3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d"} Feb 18 14:23:31 crc kubenswrapper[4817]: I0218 14:23:31.910789 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:32 crc kubenswrapper[4817]: I0218 14:23:32.342827 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" podStartSLOduration=3.342809513 podStartE2EDuration="3.342809513s" podCreationTimestamp="2026-02-18 14:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:23:32.34232129 +0000 UTC m=+1474.917857283" watchObservedRunningTime="2026-02-18 14:23:32.342809513 +0000 UTC m=+1474.918345496" Feb 18 14:23:39 crc kubenswrapper[4817]: I0218 14:23:39.804153 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:39 crc kubenswrapper[4817]: I0218 14:23:39.892378 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64c8b5dcc-9xs2l"] Feb 18 
14:23:39 crc kubenswrapper[4817]: I0218 14:23:39.892598 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" podUID="b08decc2-ae57-4858-93bd-acc42ae42148" containerName="dnsmasq-dns" containerID="cri-o://56c1e054a5fe3b8e91d26c5b54e34e0521ee5f8873010f66b22a5af0e89f1122" gracePeriod=10 Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.104858 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bb494c7f-kmtc2"] Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.108379 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.144704 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb494c7f-kmtc2"] Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.205328 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s5pr\" (UniqueName: \"kubernetes.io/projected/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-kube-api-access-2s5pr\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.205419 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-ovsdbserver-nb\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.205463 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-dns-svc\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: 
\"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.205498 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-ovsdbserver-sb\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.205588 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-config\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.205726 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-dns-swift-storage-0\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.205801 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-openstack-edpm-ipam\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.308387 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.308460 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s5pr\" (UniqueName: \"kubernetes.io/projected/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-kube-api-access-2s5pr\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.308523 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-ovsdbserver-nb\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.308579 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-dns-svc\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.308612 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-ovsdbserver-sb\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.308684 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-config\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") 
" pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.308816 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-dns-swift-storage-0\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.309828 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-dns-swift-storage-0\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.310631 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-ovsdbserver-sb\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.311338 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-openstack-edpm-ipam\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.311357 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-dns-svc\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc 
kubenswrapper[4817]: I0218 14:23:40.311748 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-ovsdbserver-nb\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.311792 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-config\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.354146 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s5pr\" (UniqueName: \"kubernetes.io/projected/d48ebb6a-086e-4e2e-b196-5f30c0a82b14-kube-api-access-2s5pr\") pod \"dnsmasq-dns-7bb494c7f-kmtc2\" (UID: \"d48ebb6a-086e-4e2e-b196-5f30c0a82b14\") " pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.429469 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:40 crc kubenswrapper[4817]: I0218 14:23:40.950338 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb494c7f-kmtc2"] Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.039024 4817 generic.go:334] "Generic (PLEG): container finished" podID="b08decc2-ae57-4858-93bd-acc42ae42148" containerID="56c1e054a5fe3b8e91d26c5b54e34e0521ee5f8873010f66b22a5af0e89f1122" exitCode=0 Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.039426 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" event={"ID":"b08decc2-ae57-4858-93bd-acc42ae42148","Type":"ContainerDied","Data":"56c1e054a5fe3b8e91d26c5b54e34e0521ee5f8873010f66b22a5af0e89f1122"} Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.042287 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" event={"ID":"d48ebb6a-086e-4e2e-b196-5f30c0a82b14","Type":"ContainerStarted","Data":"5f4ca20ecc734325a36186ccf83ae34ba76613445f50bba032691e92bcd8ac9f"} Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.259786 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.353724 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-sb\") pod \"b08decc2-ae57-4858-93bd-acc42ae42148\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.353926 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-svc\") pod \"b08decc2-ae57-4858-93bd-acc42ae42148\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.353967 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-config\") pod \"b08decc2-ae57-4858-93bd-acc42ae42148\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.354015 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bm24\" (UniqueName: \"kubernetes.io/projected/b08decc2-ae57-4858-93bd-acc42ae42148-kube-api-access-9bm24\") pod \"b08decc2-ae57-4858-93bd-acc42ae42148\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.354145 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-swift-storage-0\") pod \"b08decc2-ae57-4858-93bd-acc42ae42148\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.354175 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-nb\") pod \"b08decc2-ae57-4858-93bd-acc42ae42148\" (UID: \"b08decc2-ae57-4858-93bd-acc42ae42148\") " Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.408251 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08decc2-ae57-4858-93bd-acc42ae42148-kube-api-access-9bm24" (OuterVolumeSpecName: "kube-api-access-9bm24") pod "b08decc2-ae57-4858-93bd-acc42ae42148" (UID: "b08decc2-ae57-4858-93bd-acc42ae42148"). InnerVolumeSpecName "kube-api-access-9bm24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.460186 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bm24\" (UniqueName: \"kubernetes.io/projected/b08decc2-ae57-4858-93bd-acc42ae42148-kube-api-access-9bm24\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.482612 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b08decc2-ae57-4858-93bd-acc42ae42148" (UID: "b08decc2-ae57-4858-93bd-acc42ae42148"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.521705 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-config" (OuterVolumeSpecName: "config") pod "b08decc2-ae57-4858-93bd-acc42ae42148" (UID: "b08decc2-ae57-4858-93bd-acc42ae42148"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.522055 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b08decc2-ae57-4858-93bd-acc42ae42148" (UID: "b08decc2-ae57-4858-93bd-acc42ae42148"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.527646 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b08decc2-ae57-4858-93bd-acc42ae42148" (UID: "b08decc2-ae57-4858-93bd-acc42ae42148"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.548232 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b08decc2-ae57-4858-93bd-acc42ae42148" (UID: "b08decc2-ae57-4858-93bd-acc42ae42148"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.562679 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.562926 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.563009 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.563077 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:41 crc kubenswrapper[4817]: I0218 14:23:41.563132 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b08decc2-ae57-4858-93bd-acc42ae42148-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.057427 4817 generic.go:334] "Generic (PLEG): container finished" podID="d48ebb6a-086e-4e2e-b196-5f30c0a82b14" containerID="1383b7b927775a146c5324773ed37bc9e58d641c12a6d5a8d4db98af5322502f" exitCode=0 Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.057546 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" event={"ID":"d48ebb6a-086e-4e2e-b196-5f30c0a82b14","Type":"ContainerDied","Data":"1383b7b927775a146c5324773ed37bc9e58d641c12a6d5a8d4db98af5322502f"} Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 
14:23:42.063043 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" event={"ID":"b08decc2-ae57-4858-93bd-acc42ae42148","Type":"ContainerDied","Data":"e4df5b4345d8240ace1757e3c36822caec3ea6c5940df9dc7c066aaddfb481e8"} Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.063108 4817 scope.go:117] "RemoveContainer" containerID="56c1e054a5fe3b8e91d26c5b54e34e0521ee5f8873010f66b22a5af0e89f1122" Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.063146 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64c8b5dcc-9xs2l" Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.299123 4817 scope.go:117] "RemoveContainer" containerID="287c80c5d3d514702d73a712c09c3cfc643b05e69403d0ae2d41731b6006066f" Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.304251 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64c8b5dcc-9xs2l"] Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.315693 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64c8b5dcc-9xs2l"] Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.863271 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.863345 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.863402 4817 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.864303 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f904d428b6eee9716ba5ad8fa384beb59b260ceb6de6d026ad8fd0ef911a200e"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 14:23:42 crc kubenswrapper[4817]: I0218 14:23:42.864375 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://f904d428b6eee9716ba5ad8fa384beb59b260ceb6de6d026ad8fd0ef911a200e" gracePeriod=600 Feb 18 14:23:43 crc kubenswrapper[4817]: I0218 14:23:43.079791 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="f904d428b6eee9716ba5ad8fa384beb59b260ceb6de6d026ad8fd0ef911a200e" exitCode=0 Feb 18 14:23:43 crc kubenswrapper[4817]: I0218 14:23:43.079871 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"f904d428b6eee9716ba5ad8fa384beb59b260ceb6de6d026ad8fd0ef911a200e"} Feb 18 14:23:43 crc kubenswrapper[4817]: I0218 14:23:43.080344 4817 scope.go:117] "RemoveContainer" containerID="bd719d9fe372437c635a5966e962ebc51e7647a95b5fd6491500726f444d522f" Feb 18 14:23:43 crc kubenswrapper[4817]: I0218 14:23:43.084516 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" 
event={"ID":"d48ebb6a-086e-4e2e-b196-5f30c0a82b14","Type":"ContainerStarted","Data":"5b23ead4a1eb9a836fda2ab556c84063ba7c5d98d7e164b0cbcfa87534323033"} Feb 18 14:23:43 crc kubenswrapper[4817]: I0218 14:23:43.084700 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:43 crc kubenswrapper[4817]: I0218 14:23:43.121876 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" podStartSLOduration=3.121851458 podStartE2EDuration="3.121851458s" podCreationTimestamp="2026-02-18 14:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:23:43.107116376 +0000 UTC m=+1485.682652369" watchObservedRunningTime="2026-02-18 14:23:43.121851458 +0000 UTC m=+1485.697387441" Feb 18 14:23:43 crc kubenswrapper[4817]: I0218 14:23:43.253956 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 18 14:23:44 crc kubenswrapper[4817]: I0218 14:23:44.106430 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"} Feb 18 14:23:44 crc kubenswrapper[4817]: I0218 14:23:44.185317 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08decc2-ae57-4858-93bd-acc42ae42148" path="/var/lib/kubelet/pods/b08decc2-ae57-4858-93bd-acc42ae42148/volumes" Feb 18 14:23:50 crc kubenswrapper[4817]: I0218 14:23:50.431271 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bb494c7f-kmtc2" Feb 18 14:23:50 crc kubenswrapper[4817]: I0218 14:23:50.504134 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74554f47dc-zx2qs"] Feb 
18 14:23:50 crc kubenswrapper[4817]: I0218 14:23:50.504421 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" podUID="41ea7547-0242-42c0-9f0f-2f24a8c346ec" containerName="dnsmasq-dns" containerID="cri-o://3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d" gracePeriod=10 Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.141419 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.178226 4817 generic.go:334] "Generic (PLEG): container finished" podID="41ea7547-0242-42c0-9f0f-2f24a8c346ec" containerID="3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d" exitCode=0 Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.178276 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" event={"ID":"41ea7547-0242-42c0-9f0f-2f24a8c346ec","Type":"ContainerDied","Data":"3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d"} Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.178312 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" event={"ID":"41ea7547-0242-42c0-9f0f-2f24a8c346ec","Type":"ContainerDied","Data":"6af55609751b52df33135e6292bc7688abdf190c51d6af76a6067b9fc7c12299"} Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.178333 4817 scope.go:117] "RemoveContainer" containerID="3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.178488 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74554f47dc-zx2qs" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.231313 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-nb\") pod \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.231393 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-config\") pod \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.231441 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-sb\") pod \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.231490 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx9ln\" (UniqueName: \"kubernetes.io/projected/41ea7547-0242-42c0-9f0f-2f24a8c346ec-kube-api-access-xx9ln\") pod \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.231756 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-swift-storage-0\") pod \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.231882 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-svc\") pod \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.231909 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-openstack-edpm-ipam\") pod \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\" (UID: \"41ea7547-0242-42c0-9f0f-2f24a8c346ec\") " Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.244289 4817 scope.go:117] "RemoveContainer" containerID="81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.253150 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ea7547-0242-42c0-9f0f-2f24a8c346ec-kube-api-access-xx9ln" (OuterVolumeSpecName: "kube-api-access-xx9ln") pod "41ea7547-0242-42c0-9f0f-2f24a8c346ec" (UID: "41ea7547-0242-42c0-9f0f-2f24a8c346ec"). InnerVolumeSpecName "kube-api-access-xx9ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.317352 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "41ea7547-0242-42c0-9f0f-2f24a8c346ec" (UID: "41ea7547-0242-42c0-9f0f-2f24a8c346ec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.321603 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "41ea7547-0242-42c0-9f0f-2f24a8c346ec" (UID: "41ea7547-0242-42c0-9f0f-2f24a8c346ec"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.335099 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.335137 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx9ln\" (UniqueName: \"kubernetes.io/projected/41ea7547-0242-42c0-9f0f-2f24a8c346ec-kube-api-access-xx9ln\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.335149 4817 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.339948 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "41ea7547-0242-42c0-9f0f-2f24a8c346ec" (UID: "41ea7547-0242-42c0-9f0f-2f24a8c346ec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.353789 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "41ea7547-0242-42c0-9f0f-2f24a8c346ec" (UID: "41ea7547-0242-42c0-9f0f-2f24a8c346ec"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.358751 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "41ea7547-0242-42c0-9f0f-2f24a8c346ec" (UID: "41ea7547-0242-42c0-9f0f-2f24a8c346ec"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.362839 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-config" (OuterVolumeSpecName: "config") pod "41ea7547-0242-42c0-9f0f-2f24a8c346ec" (UID: "41ea7547-0242-42c0-9f0f-2f24a8c346ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.388952 4817 scope.go:117] "RemoveContainer" containerID="3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d" Feb 18 14:23:51 crc kubenswrapper[4817]: E0218 14:23:51.389563 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d\": container with ID starting with 3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d not found: ID does not exist" containerID="3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.389624 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d"} err="failed to get container status \"3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d\": rpc error: code = NotFound desc = could not find container 
\"3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d\": container with ID starting with 3d29d0223f32902a334c91b2332fed5ad35603e815e598b25abd608bd0f1410d not found: ID does not exist" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.389659 4817 scope.go:117] "RemoveContainer" containerID="81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6" Feb 18 14:23:51 crc kubenswrapper[4817]: E0218 14:23:51.390177 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6\": container with ID starting with 81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6 not found: ID does not exist" containerID="81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.390219 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6"} err="failed to get container status \"81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6\": rpc error: code = NotFound desc = could not find container \"81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6\": container with ID starting with 81edf4c490b073fd22c49add984af56a4027318d5a35da98319890e9305b25a6 not found: ID does not exist" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.438656 4817 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.438699 4817 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-config\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:51 crc kubenswrapper[4817]: 
I0218 14:23:51.438709 4817 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.438720 4817 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/41ea7547-0242-42c0-9f0f-2f24a8c346ec-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.527460 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74554f47dc-zx2qs"] Feb 18 14:23:51 crc kubenswrapper[4817]: I0218 14:23:51.539601 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74554f47dc-zx2qs"] Feb 18 14:23:52 crc kubenswrapper[4817]: I0218 14:23:52.184287 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ea7547-0242-42c0-9f0f-2f24a8c346ec" path="/var/lib/kubelet/pods/41ea7547-0242-42c0-9f0f-2f24a8c346ec/volumes" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.084194 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph"] Feb 18 14:23:59 crc kubenswrapper[4817]: E0218 14:23:59.085216 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ea7547-0242-42c0-9f0f-2f24a8c346ec" containerName="init" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.085230 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ea7547-0242-42c0-9f0f-2f24a8c346ec" containerName="init" Feb 18 14:23:59 crc kubenswrapper[4817]: E0218 14:23:59.085248 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08decc2-ae57-4858-93bd-acc42ae42148" containerName="init" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.085256 4817 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b08decc2-ae57-4858-93bd-acc42ae42148" containerName="init" Feb 18 14:23:59 crc kubenswrapper[4817]: E0218 14:23:59.085275 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ea7547-0242-42c0-9f0f-2f24a8c346ec" containerName="dnsmasq-dns" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.085281 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ea7547-0242-42c0-9f0f-2f24a8c346ec" containerName="dnsmasq-dns" Feb 18 14:23:59 crc kubenswrapper[4817]: E0218 14:23:59.085305 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08decc2-ae57-4858-93bd-acc42ae42148" containerName="dnsmasq-dns" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.085311 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08decc2-ae57-4858-93bd-acc42ae42148" containerName="dnsmasq-dns" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.085475 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08decc2-ae57-4858-93bd-acc42ae42148" containerName="dnsmasq-dns" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.085499 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ea7547-0242-42c0-9f0f-2f24a8c346ec" containerName="dnsmasq-dns" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.086258 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.089758 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.090011 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.090170 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.090824 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.123110 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph"] Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.202188 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.202335 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.202813 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f5b8\" (UniqueName: \"kubernetes.io/projected/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-kube-api-access-6f5b8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.202969 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.305291 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.305430 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f5b8\" (UniqueName: \"kubernetes.io/projected/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-kube-api-access-6f5b8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.305494 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.305566 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.312628 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.313090 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.320826 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.321291 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f5b8\" (UniqueName: \"kubernetes.io/projected/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-kube-api-access-6f5b8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-svdph\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:23:59 crc kubenswrapper[4817]: I0218 14:23:59.403717 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:24:00 crc kubenswrapper[4817]: I0218 14:24:00.188495 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 14:24:00 crc kubenswrapper[4817]: I0218 14:24:00.200365 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph"] Feb 18 14:24:00 crc kubenswrapper[4817]: I0218 14:24:00.281857 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" event={"ID":"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1","Type":"ContainerStarted","Data":"ba92e6020c6d40ea911cbb4183c424095cc6e6ee18692774a18cced508e9980d"} Feb 18 14:24:00 crc kubenswrapper[4817]: I0218 14:24:00.283299 4817 generic.go:334] "Generic (PLEG): container finished" podID="e19f3906-864f-49f8-b3f1-e3cfbcae4133" containerID="d44c9d63a6330045a9fb9164d56ca185d80fcb7977ef909fd3029fa204fa25c2" exitCode=0 Feb 18 14:24:00 crc kubenswrapper[4817]: I0218 14:24:00.283397 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e19f3906-864f-49f8-b3f1-e3cfbcae4133","Type":"ContainerDied","Data":"d44c9d63a6330045a9fb9164d56ca185d80fcb7977ef909fd3029fa204fa25c2"} Feb 18 14:24:00 crc 
kubenswrapper[4817]: I0218 14:24:00.285193 4817 generic.go:334] "Generic (PLEG): container finished" podID="f49989fd-6326-4020-aba0-45b49ed37872" containerID="76bb09153cb11303694519dadf556026c02ba550bffb5ac15c6eafddb920d686" exitCode=0 Feb 18 14:24:00 crc kubenswrapper[4817]: I0218 14:24:00.285233 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f49989fd-6326-4020-aba0-45b49ed37872","Type":"ContainerDied","Data":"76bb09153cb11303694519dadf556026c02ba550bffb5ac15c6eafddb920d686"} Feb 18 14:24:01 crc kubenswrapper[4817]: I0218 14:24:01.310799 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f49989fd-6326-4020-aba0-45b49ed37872","Type":"ContainerStarted","Data":"404c4e0dc2910bec792e016644b529d597ee91977766f061fe5d5af97e494a5f"} Feb 18 14:24:01 crc kubenswrapper[4817]: I0218 14:24:01.311412 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 18 14:24:01 crc kubenswrapper[4817]: I0218 14:24:01.322460 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e19f3906-864f-49f8-b3f1-e3cfbcae4133","Type":"ContainerStarted","Data":"f8653b616bb93b59d2fbacbe1a466c889b742656d898a2ae2fce6f1f33fe455b"} Feb 18 14:24:01 crc kubenswrapper[4817]: I0218 14:24:01.323141 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:24:01 crc kubenswrapper[4817]: I0218 14:24:01.344840 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.344805207 podStartE2EDuration="37.344805207s" podCreationTimestamp="2026-02-18 14:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:24:01.339836118 +0000 UTC m=+1503.915372101" 
watchObservedRunningTime="2026-02-18 14:24:01.344805207 +0000 UTC m=+1503.920341180" Feb 18 14:24:01 crc kubenswrapper[4817]: I0218 14:24:01.376834 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.376803256 podStartE2EDuration="37.376803256s" podCreationTimestamp="2026-02-18 14:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 14:24:01.369034275 +0000 UTC m=+1503.944570258" watchObservedRunningTime="2026-02-18 14:24:01.376803256 +0000 UTC m=+1503.952339239" Feb 18 14:24:04 crc kubenswrapper[4817]: I0218 14:24:04.384723 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Feb 18 14:24:12 crc kubenswrapper[4817]: I0218 14:24:12.383225 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cloudkitty-api-0" podUID="09f7d0fc-a70f-4296-82f1-1cdd302a4a60" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.242:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 14:24:12 crc kubenswrapper[4817]: I0218 14:24:12.383522 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="09f7d0fc-a70f-4296-82f1-1cdd302a4a60" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.242:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 18 14:24:14 crc kubenswrapper[4817]: I0218 14:24:14.780100 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="f49989fd-6326-4020-aba0-45b49ed37872" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.239:5671: connect: connection refused" Feb 18 14:24:14 crc kubenswrapper[4817]: I0218 14:24:14.836609 4817 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e19f3906-864f-49f8-b3f1-e3cfbcae4133" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.240:5671: connect: connection refused" Feb 18 14:24:16 crc kubenswrapper[4817]: E0218 14:24:16.428640 4817 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Feb 18 14:24:16 crc kubenswrapper[4817]: E0218 14:24:16.429093 4817 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 18 14:24:16 crc kubenswrapper[4817]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Feb 18 14:24:16 crc kubenswrapper[4817]: - hosts: all Feb 18 14:24:16 crc kubenswrapper[4817]: strategy: linear Feb 18 14:24:16 crc kubenswrapper[4817]: tasks: Feb 18 14:24:16 crc kubenswrapper[4817]: - name: Enable podified-repos Feb 18 14:24:16 crc kubenswrapper[4817]: become: true Feb 18 14:24:16 crc kubenswrapper[4817]: ansible.builtin.shell: | Feb 18 14:24:16 crc kubenswrapper[4817]: set -euxo pipefail Feb 18 14:24:16 crc kubenswrapper[4817]: pushd /var/tmp Feb 18 14:24:16 crc kubenswrapper[4817]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Feb 18 14:24:16 crc kubenswrapper[4817]: pushd repo-setup-main Feb 18 14:24:16 crc kubenswrapper[4817]: python3 -m venv ./venv Feb 18 14:24:16 crc kubenswrapper[4817]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Feb 18 14:24:16 crc kubenswrapper[4817]: ./venv/bin/repo-setup current-podified -b antelope Feb 18 14:24:16 crc kubenswrapper[4817]: 
popd Feb 18 14:24:16 crc kubenswrapper[4817]: rm -rf repo-setup-main Feb 18 14:24:16 crc kubenswrapper[4817]: Feb 18 14:24:16 crc kubenswrapper[4817]: Feb 18 14:24:16 crc kubenswrapper[4817]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Feb 18 14:24:16 crc kubenswrapper[4817]: edpm_override_hosts: openstack-edpm-ipam Feb 18 14:24:16 crc kubenswrapper[4817]: edpm_service_type: repo-setup Feb 18 14:24:16 crc kubenswrapper[4817]: Feb 18 14:24:16 crc kubenswrapper[4817]: Feb 18 14:24:16 crc kubenswrapper[4817]: ,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key-openstack-edpm-ipam,ReadOnly:false,MountPath:/runner/env/ssh_key/ssh_key_openstack-edpm-ipam,SubPath:ssh_key_openstack-edpm-ipam,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6f5b8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-en
v,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-svdph_openstack(3c90beed-8bc0-4b1c-9c6c-2279e303fbb1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Feb 18 14:24:16 crc kubenswrapper[4817]: > logger="UnhandledError" Feb 18 14:24:16 crc kubenswrapper[4817]: E0218 14:24:16.430199 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" podUID="3c90beed-8bc0-4b1c-9c6c-2279e303fbb1" Feb 18 14:24:16 crc kubenswrapper[4817]: E0218 14:24:16.553489 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" podUID="3c90beed-8bc0-4b1c-9c6c-2279e303fbb1" Feb 18 14:24:24 crc kubenswrapper[4817]: I0218 14:24:24.780196 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 18 14:24:24 crc kubenswrapper[4817]: I0218 14:24:24.839324 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 18 14:24:29 crc kubenswrapper[4817]: I0218 14:24:29.110511 4817 scope.go:117] "RemoveContainer" containerID="6666a003b16cfcafe8266b9637813f3eb39a7b98265f5a7618da6683932a4c02" Feb 18 14:24:29 crc kubenswrapper[4817]: I0218 14:24:29.679938 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" event={"ID":"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1","Type":"ContainerStarted","Data":"f4e66b766d496b447f61689ef4c3cfd67737e5a158010425506b16ebc03c6a4b"} Feb 18 14:24:29 crc kubenswrapper[4817]: I0218 14:24:29.704997 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" podStartSLOduration=1.689504766 podStartE2EDuration="30.704958317s" podCreationTimestamp="2026-02-18 14:23:59 +0000 UTC" firstStartedPulling="2026-02-18 14:24:00.188253044 +0000 UTC m=+1502.763789027" lastFinishedPulling="2026-02-18 14:24:29.203706595 +0000 UTC m=+1531.779242578" observedRunningTime="2026-02-18 14:24:29.702424871 +0000 UTC m=+1532.277960864" watchObservedRunningTime="2026-02-18 14:24:29.704958317 +0000 UTC m=+1532.280494300" Feb 18 14:24:31 crc kubenswrapper[4817]: I0218 14:24:31.892754 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5b8n8"] Feb 18 14:24:31 crc kubenswrapper[4817]: I0218 14:24:31.901558 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:31 crc kubenswrapper[4817]: I0218 14:24:31.914001 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b8n8"] Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.049311 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfxbp\" (UniqueName: \"kubernetes.io/projected/e1268dca-bb64-438a-8add-e369fc7c711b-kube-api-access-vfxbp\") pod \"redhat-marketplace-5b8n8\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") " pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.049422 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-utilities\") pod \"redhat-marketplace-5b8n8\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") " pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.049577 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-catalog-content\") pod \"redhat-marketplace-5b8n8\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") " pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.151511 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfxbp\" (UniqueName: \"kubernetes.io/projected/e1268dca-bb64-438a-8add-e369fc7c711b-kube-api-access-vfxbp\") pod \"redhat-marketplace-5b8n8\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") " pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.151669 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-utilities\") pod \"redhat-marketplace-5b8n8\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") " pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.152263 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-utilities\") pod \"redhat-marketplace-5b8n8\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") " pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.152461 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-catalog-content\") pod \"redhat-marketplace-5b8n8\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") " pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.152785 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-catalog-content\") pod \"redhat-marketplace-5b8n8\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") " pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.173455 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfxbp\" (UniqueName: \"kubernetes.io/projected/e1268dca-bb64-438a-8add-e369fc7c711b-kube-api-access-vfxbp\") pod \"redhat-marketplace-5b8n8\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") " pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.223292 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:32 crc kubenswrapper[4817]: I0218 14:24:32.720092 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b8n8"] Feb 18 14:24:32 crc kubenswrapper[4817]: W0218 14:24:32.723641 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1268dca_bb64_438a_8add_e369fc7c711b.slice/crio-34a57a782c5d560d058bbf48f291499866279111d4a504374fe475afb1032a17 WatchSource:0}: Error finding container 34a57a782c5d560d058bbf48f291499866279111d4a504374fe475afb1032a17: Status 404 returned error can't find the container with id 34a57a782c5d560d058bbf48f291499866279111d4a504374fe475afb1032a17 Feb 18 14:24:33 crc kubenswrapper[4817]: I0218 14:24:33.724948 4817 generic.go:334] "Generic (PLEG): container finished" podID="e1268dca-bb64-438a-8add-e369fc7c711b" containerID="0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273" exitCode=0 Feb 18 14:24:33 crc kubenswrapper[4817]: I0218 14:24:33.725062 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b8n8" event={"ID":"e1268dca-bb64-438a-8add-e369fc7c711b","Type":"ContainerDied","Data":"0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273"} Feb 18 14:24:33 crc kubenswrapper[4817]: I0218 14:24:33.725296 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b8n8" event={"ID":"e1268dca-bb64-438a-8add-e369fc7c711b","Type":"ContainerStarted","Data":"34a57a782c5d560d058bbf48f291499866279111d4a504374fe475afb1032a17"} Feb 18 14:24:34 crc kubenswrapper[4817]: I0218 14:24:34.739118 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b8n8" 
event={"ID":"e1268dca-bb64-438a-8add-e369fc7c711b","Type":"ContainerStarted","Data":"b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d"} Feb 18 14:24:36 crc kubenswrapper[4817]: I0218 14:24:36.774871 4817 generic.go:334] "Generic (PLEG): container finished" podID="e1268dca-bb64-438a-8add-e369fc7c711b" containerID="b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d" exitCode=0 Feb 18 14:24:36 crc kubenswrapper[4817]: I0218 14:24:36.774947 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b8n8" event={"ID":"e1268dca-bb64-438a-8add-e369fc7c711b","Type":"ContainerDied","Data":"b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d"} Feb 18 14:24:37 crc kubenswrapper[4817]: I0218 14:24:37.787937 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b8n8" event={"ID":"e1268dca-bb64-438a-8add-e369fc7c711b","Type":"ContainerStarted","Data":"4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803"} Feb 18 14:24:37 crc kubenswrapper[4817]: I0218 14:24:37.818246 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5b8n8" podStartSLOduration=3.093490915 podStartE2EDuration="6.81822377s" podCreationTimestamp="2026-02-18 14:24:31 +0000 UTC" firstStartedPulling="2026-02-18 14:24:33.727784403 +0000 UTC m=+1536.303320386" lastFinishedPulling="2026-02-18 14:24:37.452517258 +0000 UTC m=+1540.028053241" observedRunningTime="2026-02-18 14:24:37.806172047 +0000 UTC m=+1540.381708030" watchObservedRunningTime="2026-02-18 14:24:37.81822377 +0000 UTC m=+1540.393759753" Feb 18 14:24:42 crc kubenswrapper[4817]: I0218 14:24:42.224046 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:42 crc kubenswrapper[4817]: I0218 14:24:42.224951 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:42 crc kubenswrapper[4817]: I0218 14:24:42.838234 4817 generic.go:334] "Generic (PLEG): container finished" podID="3c90beed-8bc0-4b1c-9c6c-2279e303fbb1" containerID="f4e66b766d496b447f61689ef4c3cfd67737e5a158010425506b16ebc03c6a4b" exitCode=0 Feb 18 14:24:42 crc kubenswrapper[4817]: I0218 14:24:42.838290 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" event={"ID":"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1","Type":"ContainerDied","Data":"f4e66b766d496b447f61689ef4c3cfd67737e5a158010425506b16ebc03c6a4b"} Feb 18 14:24:43 crc kubenswrapper[4817]: I0218 14:24:43.269239 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5b8n8" podUID="e1268dca-bb64-438a-8add-e369fc7c711b" containerName="registry-server" probeResult="failure" output=< Feb 18 14:24:43 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 14:24:43 crc kubenswrapper[4817]: > Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.408877 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.432791 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f5b8\" (UniqueName: \"kubernetes.io/projected/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-kube-api-access-6f5b8\") pod \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.433056 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-inventory\") pod \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.433139 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-repo-setup-combined-ca-bundle\") pod \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.433177 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-ssh-key-openstack-edpm-ipam\") pod \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\" (UID: \"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1\") " Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.444241 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3c90beed-8bc0-4b1c-9c6c-2279e303fbb1" (UID: "3c90beed-8bc0-4b1c-9c6c-2279e303fbb1"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.449279 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-kube-api-access-6f5b8" (OuterVolumeSpecName: "kube-api-access-6f5b8") pod "3c90beed-8bc0-4b1c-9c6c-2279e303fbb1" (UID: "3c90beed-8bc0-4b1c-9c6c-2279e303fbb1"). InnerVolumeSpecName "kube-api-access-6f5b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.482097 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3c90beed-8bc0-4b1c-9c6c-2279e303fbb1" (UID: "3c90beed-8bc0-4b1c-9c6c-2279e303fbb1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.497583 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-inventory" (OuterVolumeSpecName: "inventory") pod "3c90beed-8bc0-4b1c-9c6c-2279e303fbb1" (UID: "3c90beed-8bc0-4b1c-9c6c-2279e303fbb1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.535503 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.535541 4817 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.535553 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.535563 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f5b8\" (UniqueName: \"kubernetes.io/projected/3c90beed-8bc0-4b1c-9c6c-2279e303fbb1-kube-api-access-6f5b8\") on node \"crc\" DevicePath \"\"" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.857199 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" event={"ID":"3c90beed-8bc0-4b1c-9c6c-2279e303fbb1","Type":"ContainerDied","Data":"ba92e6020c6d40ea911cbb4183c424095cc6e6ee18692774a18cced508e9980d"} Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.857418 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba92e6020c6d40ea911cbb4183c424095cc6e6ee18692774a18cced508e9980d" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.857254 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-svdph" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.998615 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2"] Feb 18 14:24:44 crc kubenswrapper[4817]: E0218 14:24:44.999199 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c90beed-8bc0-4b1c-9c6c-2279e303fbb1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.999223 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c90beed-8bc0-4b1c-9c6c-2279e303fbb1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 14:24:44 crc kubenswrapper[4817]: I0218 14:24:44.999454 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c90beed-8bc0-4b1c-9c6c-2279e303fbb1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.000239 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.002772 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.002794 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.002864 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.003023 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.010499 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2"] Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.042581 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n2xx\" (UniqueName: \"kubernetes.io/projected/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-kube-api-access-9n2xx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hsvp2\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.042626 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hsvp2\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.043053 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hsvp2\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.145064 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hsvp2\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.145260 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n2xx\" (UniqueName: \"kubernetes.io/projected/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-kube-api-access-9n2xx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hsvp2\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.145294 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hsvp2\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.149593 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-hsvp2\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.149694 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hsvp2\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.161140 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n2xx\" (UniqueName: \"kubernetes.io/projected/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-kube-api-access-9n2xx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hsvp2\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.317378 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:45 crc kubenswrapper[4817]: W0218 14:24:45.847204 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1c3dd6e_0387_41b5_b7a1_fb2ea1e053f7.slice/crio-dcb6d337c2609f23a43bef4bd8151f3c19fa5297d66b92acc4aaa916da9ee4c9 WatchSource:0}: Error finding container dcb6d337c2609f23a43bef4bd8151f3c19fa5297d66b92acc4aaa916da9ee4c9: Status 404 returned error can't find the container with id dcb6d337c2609f23a43bef4bd8151f3c19fa5297d66b92acc4aaa916da9ee4c9 Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.847494 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2"] Feb 18 14:24:45 crc kubenswrapper[4817]: I0218 14:24:45.871311 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" event={"ID":"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7","Type":"ContainerStarted","Data":"dcb6d337c2609f23a43bef4bd8151f3c19fa5297d66b92acc4aaa916da9ee4c9"} Feb 18 14:24:46 crc kubenswrapper[4817]: I0218 14:24:46.884294 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" event={"ID":"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7","Type":"ContainerStarted","Data":"87ca6e3fc37eca51c9676c5745286641f57da1b9bd52a43b854c4b8eb47b11cb"} Feb 18 14:24:49 crc kubenswrapper[4817]: I0218 14:24:49.915829 4817 generic.go:334] "Generic (PLEG): container finished" podID="d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7" containerID="87ca6e3fc37eca51c9676c5745286641f57da1b9bd52a43b854c4b8eb47b11cb" exitCode=0 Feb 18 14:24:49 crc kubenswrapper[4817]: I0218 14:24:49.915935 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" 
event={"ID":"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7","Type":"ContainerDied","Data":"87ca6e3fc37eca51c9676c5745286641f57da1b9bd52a43b854c4b8eb47b11cb"} Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.471423 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.634195 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-inventory\") pod \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.634317 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n2xx\" (UniqueName: \"kubernetes.io/projected/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-kube-api-access-9n2xx\") pod \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.634402 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-ssh-key-openstack-edpm-ipam\") pod \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\" (UID: \"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7\") " Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.641152 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-kube-api-access-9n2xx" (OuterVolumeSpecName: "kube-api-access-9n2xx") pod "d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7" (UID: "d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7"). InnerVolumeSpecName "kube-api-access-9n2xx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.668434 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-inventory" (OuterVolumeSpecName: "inventory") pod "d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7" (UID: "d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.669368 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7" (UID: "d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.737259 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.737319 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n2xx\" (UniqueName: \"kubernetes.io/projected/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-kube-api-access-9n2xx\") on node \"crc\" DevicePath \"\"" Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.737333 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.938882 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" 
event={"ID":"d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7","Type":"ContainerDied","Data":"dcb6d337c2609f23a43bef4bd8151f3c19fa5297d66b92acc4aaa916da9ee4c9"} Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.939279 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcb6d337c2609f23a43bef4bd8151f3c19fa5297d66b92acc4aaa916da9ee4c9" Feb 18 14:24:51 crc kubenswrapper[4817]: I0218 14:24:51.938947 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hsvp2" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.008342 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f"] Feb 18 14:24:52 crc kubenswrapper[4817]: E0218 14:24:52.008855 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.008874 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.009088 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.009815 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.014276 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.014521 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.014680 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.015259 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.020819 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f"] Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.144838 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmjgb\" (UniqueName: \"kubernetes.io/projected/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-kube-api-access-hmjgb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.144899 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 
14:24:52.144944 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.144999 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.247553 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.247632 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.247792 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmjgb\" (UniqueName: 
\"kubernetes.io/projected/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-kube-api-access-hmjgb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.247827 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.252264 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.252429 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.253162 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.269651 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmjgb\" (UniqueName: \"kubernetes.io/projected/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-kube-api-access-hmjgb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.281261 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.325522 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.370513 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5b8n8" Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.518193 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b8n8"] Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.891868 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f"] Feb 18 14:24:52 crc kubenswrapper[4817]: I0218 14:24:52.950025 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" event={"ID":"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9","Type":"ContainerStarted","Data":"af9c8cef9e9c4dd3a625a51c509bcb9c5dbbbd0475f342427c00d6d5b69a787b"} Feb 18 14:24:53 crc kubenswrapper[4817]: I0218 14:24:53.972080 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5b8n8" 
podUID="e1268dca-bb64-438a-8add-e369fc7c711b" containerName="registry-server" containerID="cri-o://4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803" gracePeriod=2
Feb 18 14:24:53 crc kubenswrapper[4817]: I0218 14:24:53.972208 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" event={"ID":"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9","Type":"ContainerStarted","Data":"9730ee73612795a3c8fa4bbc6f82ffd8b9b1aed164c4ff8e05dba8d4e9c606a0"}
Feb 18 14:24:53 crc kubenswrapper[4817]: I0218 14:24:53.989908 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" podStartSLOduration=2.563881041 podStartE2EDuration="2.989885139s" podCreationTimestamp="2026-02-18 14:24:51 +0000 UTC" firstStartedPulling="2026-02-18 14:24:52.895895353 +0000 UTC m=+1555.471431336" lastFinishedPulling="2026-02-18 14:24:53.321899441 +0000 UTC m=+1555.897435434" observedRunningTime="2026-02-18 14:24:53.987075988 +0000 UTC m=+1556.562611981" watchObservedRunningTime="2026-02-18 14:24:53.989885139 +0000 UTC m=+1556.565421122"
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.560716 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b8n8"
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.611756 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfxbp\" (UniqueName: \"kubernetes.io/projected/e1268dca-bb64-438a-8add-e369fc7c711b-kube-api-access-vfxbp\") pod \"e1268dca-bb64-438a-8add-e369fc7c711b\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") "
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.611880 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-catalog-content\") pod \"e1268dca-bb64-438a-8add-e369fc7c711b\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") "
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.636390 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1268dca-bb64-438a-8add-e369fc7c711b-kube-api-access-vfxbp" (OuterVolumeSpecName: "kube-api-access-vfxbp") pod "e1268dca-bb64-438a-8add-e369fc7c711b" (UID: "e1268dca-bb64-438a-8add-e369fc7c711b"). InnerVolumeSpecName "kube-api-access-vfxbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.654452 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1268dca-bb64-438a-8add-e369fc7c711b" (UID: "e1268dca-bb64-438a-8add-e369fc7c711b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.713606 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-utilities\") pod \"e1268dca-bb64-438a-8add-e369fc7c711b\" (UID: \"e1268dca-bb64-438a-8add-e369fc7c711b\") "
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.714435 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-utilities" (OuterVolumeSpecName: "utilities") pod "e1268dca-bb64-438a-8add-e369fc7c711b" (UID: "e1268dca-bb64-438a-8add-e369fc7c711b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.714560 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfxbp\" (UniqueName: \"kubernetes.io/projected/e1268dca-bb64-438a-8add-e369fc7c711b-kube-api-access-vfxbp\") on node \"crc\" DevicePath \"\""
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.714586 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.816418 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1268dca-bb64-438a-8add-e369fc7c711b-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.983216 4817 generic.go:334] "Generic (PLEG): container finished" podID="e1268dca-bb64-438a-8add-e369fc7c711b" containerID="4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803" exitCode=0
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.983306 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b8n8"
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.983372 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b8n8" event={"ID":"e1268dca-bb64-438a-8add-e369fc7c711b","Type":"ContainerDied","Data":"4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803"}
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.984539 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b8n8" event={"ID":"e1268dca-bb64-438a-8add-e369fc7c711b","Type":"ContainerDied","Data":"34a57a782c5d560d058bbf48f291499866279111d4a504374fe475afb1032a17"}
Feb 18 14:24:54 crc kubenswrapper[4817]: I0218 14:24:54.984569 4817 scope.go:117] "RemoveContainer" containerID="4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803"
Feb 18 14:24:55 crc kubenswrapper[4817]: I0218 14:24:55.019366 4817 scope.go:117] "RemoveContainer" containerID="b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d"
Feb 18 14:24:55 crc kubenswrapper[4817]: I0218 14:24:55.040344 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b8n8"]
Feb 18 14:24:55 crc kubenswrapper[4817]: I0218 14:24:55.049237 4817 scope.go:117] "RemoveContainer" containerID="0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273"
Feb 18 14:24:55 crc kubenswrapper[4817]: I0218 14:24:55.053100 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b8n8"]
Feb 18 14:24:55 crc kubenswrapper[4817]: I0218 14:24:55.106182 4817 scope.go:117] "RemoveContainer" containerID="4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803"
Feb 18 14:24:55 crc kubenswrapper[4817]: E0218 14:24:55.106671 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803\": container with ID starting with 4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803 not found: ID does not exist" containerID="4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803"
Feb 18 14:24:55 crc kubenswrapper[4817]: I0218 14:24:55.106710 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803"} err="failed to get container status \"4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803\": rpc error: code = NotFound desc = could not find container \"4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803\": container with ID starting with 4ba34bc28364a99e6e64500e38e4c6e7f6bab95d4723fe45bf71aef5c04b0803 not found: ID does not exist"
Feb 18 14:24:55 crc kubenswrapper[4817]: I0218 14:24:55.106733 4817 scope.go:117] "RemoveContainer" containerID="b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d"
Feb 18 14:24:55 crc kubenswrapper[4817]: E0218 14:24:55.107676 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d\": container with ID starting with b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d not found: ID does not exist" containerID="b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d"
Feb 18 14:24:55 crc kubenswrapper[4817]: I0218 14:24:55.107712 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d"} err="failed to get container status \"b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d\": rpc error: code = NotFound desc = could not find container \"b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d\": container with ID starting with b18fd2064f4f8f09c95070a2b673baa5759278bad5d139da696e6a623840a80d not found: ID does not exist"
Feb 18 14:24:55 crc kubenswrapper[4817]: I0218 14:24:55.107728 4817 scope.go:117] "RemoveContainer" containerID="0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273"
Feb 18 14:24:55 crc kubenswrapper[4817]: E0218 14:24:55.108172 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273\": container with ID starting with 0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273 not found: ID does not exist" containerID="0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273"
Feb 18 14:24:55 crc kubenswrapper[4817]: I0218 14:24:55.108198 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273"} err="failed to get container status \"0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273\": rpc error: code = NotFound desc = could not find container \"0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273\": container with ID starting with 0cee58c1eb1bc4389602743dc73633d43a3fe1887a7048ce545f7b89e976b273 not found: ID does not exist"
Feb 18 14:24:56 crc kubenswrapper[4817]: I0218 14:24:56.184450 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1268dca-bb64-438a-8add-e369fc7c711b" path="/var/lib/kubelet/pods/e1268dca-bb64-438a-8add-e369fc7c711b/volumes"
Feb 18 14:25:29 crc kubenswrapper[4817]: I0218 14:25:29.323383 4817 scope.go:117] "RemoveContainer" containerID="66021b93b6ee75d0eb93524e345b56d14e0c76bd376e4922b3beebfde1d71b4a"
Feb 18 14:25:29 crc kubenswrapper[4817]: I0218 14:25:29.364891 4817 scope.go:117] "RemoveContainer" containerID="00e17bd6056b65696e77d9641aa5c8e9534f3cfe54f718e4038fd916bdae0100"
Feb 18 14:25:29 crc kubenswrapper[4817]: I0218 14:25:29.509478 4817 scope.go:117] "RemoveContainer" containerID="c14342e7e3cade0d49811e211d0118efbb76d10f00b8a8c70653c697a6221a7a"
Feb 18 14:26:12 crc kubenswrapper[4817]: I0218 14:26:12.863402 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:26:12 crc kubenswrapper[4817]: I0218 14:26:12.863968 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:26:29 crc kubenswrapper[4817]: I0218 14:26:29.752608 4817 scope.go:117] "RemoveContainer" containerID="ed628f07772e2e05d3087689a31bbe40ceac0ae62b9496713f1d17d667347f91"
Feb 18 14:26:29 crc kubenswrapper[4817]: I0218 14:26:29.774494 4817 scope.go:117] "RemoveContainer" containerID="693c3b70dfb28f9d779c518cc0749685cca27d25bf824daa09bc85e7ac834e16"
Feb 18 14:26:29 crc kubenswrapper[4817]: I0218 14:26:29.799399 4817 scope.go:117] "RemoveContainer" containerID="da730a3033788be135c8c0fdb0570d392868504128e0895329ceaad366c8a371"
Feb 18 14:26:29 crc kubenswrapper[4817]: I0218 14:26:29.825601 4817 scope.go:117] "RemoveContainer" containerID="f04251b56135cf24124d6f6b653e2c82951eecc3a89e526f44eabe6a9a119e2f"
Feb 18 14:26:29 crc kubenswrapper[4817]: I0218 14:26:29.862490 4817 scope.go:117] "RemoveContainer" containerID="152a3754c1fc459ed0c923bc20716b4ca6647241bf64f323939d93a00532731d"
Feb 18 14:26:42 crc kubenswrapper[4817]: I0218 14:26:42.863364 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:26:42 crc kubenswrapper[4817]: I0218 14:26:42.863955 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:27:07 crc kubenswrapper[4817]: I0218 14:27:07.044988 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bd17-account-create-update-v4mb6"]
Feb 18 14:27:07 crc kubenswrapper[4817]: I0218 14:27:07.054990 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bd17-account-create-update-v4mb6"]
Feb 18 14:27:07 crc kubenswrapper[4817]: I0218 14:27:07.065783 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-268hx"]
Feb 18 14:27:07 crc kubenswrapper[4817]: I0218 14:27:07.075647 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e38b-account-create-update-rlln2"]
Feb 18 14:27:07 crc kubenswrapper[4817]: I0218 14:27:07.084362 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e38b-account-create-update-rlln2"]
Feb 18 14:27:07 crc kubenswrapper[4817]: I0218 14:27:07.093054 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-268hx"]
Feb 18 14:27:07 crc kubenswrapper[4817]: I0218 14:27:07.101723 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-6bvsw"]
Feb 18 14:27:07 crc kubenswrapper[4817]: I0218 14:27:07.110713 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3519-account-create-update-4tjwq"]
Feb 18 14:27:07 crc kubenswrapper[4817]: I0218 14:27:07.121109 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3519-account-create-update-4tjwq"]
Feb 18 14:27:07 crc kubenswrapper[4817]: I0218 14:27:07.129484 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-6bvsw"]
Feb 18 14:27:08 crc kubenswrapper[4817]: I0218 14:27:08.028006 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6dg5k"]
Feb 18 14:27:08 crc kubenswrapper[4817]: I0218 14:27:08.040877 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6dg5k"]
Feb 18 14:27:08 crc kubenswrapper[4817]: I0218 14:27:08.188078 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22ca54d4-d910-46fb-9966-792b61e4969b" path="/var/lib/kubelet/pods/22ca54d4-d910-46fb-9966-792b61e4969b/volumes"
Feb 18 14:27:08 crc kubenswrapper[4817]: I0218 14:27:08.189367 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45992562-7a07-46a2-93fb-3c5fc00b367c" path="/var/lib/kubelet/pods/45992562-7a07-46a2-93fb-3c5fc00b367c/volumes"
Feb 18 14:27:08 crc kubenswrapper[4817]: I0218 14:27:08.189895 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68ca39a4-c529-4e29-b4f0-bea5dde2dab9" path="/var/lib/kubelet/pods/68ca39a4-c529-4e29-b4f0-bea5dde2dab9/volumes"
Feb 18 14:27:08 crc kubenswrapper[4817]: I0218 14:27:08.190891 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5efb39b-4f4c-4cae-a4f6-529877efbafb" path="/var/lib/kubelet/pods/d5efb39b-4f4c-4cae-a4f6-529877efbafb/volumes"
Feb 18 14:27:08 crc kubenswrapper[4817]: I0218 14:27:08.192592 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3b6456-c155-43c3-9b35-5832009c7054" path="/var/lib/kubelet/pods/de3b6456-c155-43c3-9b35-5832009c7054/volumes"
Feb 18 14:27:08 crc kubenswrapper[4817]: I0218 14:27:08.194745 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eabe27ea-3386-4a4d-bba3-1786b2041d2a" path="/var/lib/kubelet/pods/eabe27ea-3386-4a4d-bba3-1786b2041d2a/volumes"
Feb 18 14:27:12 crc kubenswrapper[4817]: I0218 14:27:12.863786 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:27:12 crc kubenswrapper[4817]: I0218 14:27:12.865531 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:27:12 crc kubenswrapper[4817]: I0218 14:27:12.865666 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb"
Feb 18 14:27:12 crc kubenswrapper[4817]: I0218 14:27:12.866563 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 14:27:12 crc kubenswrapper[4817]: I0218 14:27:12.866723 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" gracePeriod=600
Feb 18 14:27:12 crc kubenswrapper[4817]: E0218 14:27:12.993161 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:27:13 crc kubenswrapper[4817]: I0218 14:27:13.416309 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" exitCode=0
Feb 18 14:27:13 crc kubenswrapper[4817]: I0218 14:27:13.416428 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"}
Feb 18 14:27:13 crc kubenswrapper[4817]: I0218 14:27:13.416715 4817 scope.go:117] "RemoveContainer" containerID="f904d428b6eee9716ba5ad8fa384beb59b260ceb6de6d026ad8fd0ef911a200e"
Feb 18 14:27:13 crc kubenswrapper[4817]: I0218 14:27:13.417440 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"
Feb 18 14:27:13 crc kubenswrapper[4817]: E0218 14:27:13.417710 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:27:14 crc kubenswrapper[4817]: I0218 14:27:14.030596 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8fqk4"]
Feb 18 14:27:14 crc kubenswrapper[4817]: I0218 14:27:14.040226 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8fqk4"]
Feb 18 14:27:14 crc kubenswrapper[4817]: I0218 14:27:14.181748 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="206a327c-35a4-4c97-9e11-9792c464b2c3" path="/var/lib/kubelet/pods/206a327c-35a4-4c97-9e11-9792c464b2c3/volumes"
Feb 18 14:27:24 crc kubenswrapper[4817]: I0218 14:27:24.048819 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-4q5qq"]
Feb 18 14:27:24 crc kubenswrapper[4817]: I0218 14:27:24.063886 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zqdhk"]
Feb 18 14:27:24 crc kubenswrapper[4817]: I0218 14:27:24.075720 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zqdhk"]
Feb 18 14:27:24 crc kubenswrapper[4817]: I0218 14:27:24.085296 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-4q5qq"]
Feb 18 14:27:24 crc kubenswrapper[4817]: I0218 14:27:24.172612 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"
Feb 18 14:27:24 crc kubenswrapper[4817]: E0218 14:27:24.173004 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:27:24 crc kubenswrapper[4817]: I0218 14:27:24.186961 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="639aeed9-1ba1-4ad8-acb2-90e3e800e4a9" path="/var/lib/kubelet/pods/639aeed9-1ba1-4ad8-acb2-90e3e800e4a9/volumes"
Feb 18 14:27:24 crc kubenswrapper[4817]: I0218 14:27:24.188863 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e72216-792c-4525-b231-2370a5b4d8ef" path="/var/lib/kubelet/pods/67e72216-792c-4525-b231-2370a5b4d8ef/volumes"
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.068841 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5d24-account-create-update-rrt2r"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.088418 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db0f-account-create-update-ljxz9"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.100880 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-553d-account-create-update-twtnj"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.112591 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5d24-account-create-update-rrt2r"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.123725 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-553d-account-create-update-twtnj"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.131966 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bkwzx"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.141264 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db0f-account-create-update-ljxz9"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.149582 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bkwzx"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.156930 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-hs8dt"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.165621 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9002-account-create-update-x52d2"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.175805 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-hs8dt"]
Feb 18 14:27:27 crc kubenswrapper[4817]: I0218 14:27:27.186394 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9002-account-create-update-x52d2"]
Feb 18 14:27:28 crc kubenswrapper[4817]: I0218 14:27:28.195038 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="288ec94a-2fa7-44a5-afaf-1bf7909336a7" path="/var/lib/kubelet/pods/288ec94a-2fa7-44a5-afaf-1bf7909336a7/volumes"
Feb 18 14:27:28 crc kubenswrapper[4817]: I0218 14:27:28.196059 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d4b2205-ea38-4a29-858f-c2acb3cbd423" path="/var/lib/kubelet/pods/2d4b2205-ea38-4a29-858f-c2acb3cbd423/volumes"
Feb 18 14:27:28 crc kubenswrapper[4817]: I0218 14:27:28.198989 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4135b831-d02a-45bc-aea0-4584e8b2a01f" path="/var/lib/kubelet/pods/4135b831-d02a-45bc-aea0-4584e8b2a01f/volumes"
Feb 18 14:27:28 crc kubenswrapper[4817]: I0218 14:27:28.200523 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e9ee1a-8b93-4306-ad94-d154b80f60c3" path="/var/lib/kubelet/pods/e4e9ee1a-8b93-4306-ad94-d154b80f60c3/volumes"
Feb 18 14:27:28 crc kubenswrapper[4817]: I0218 14:27:28.202074 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed148ce2-1bf9-44a7-b0bd-444c12bead6c" path="/var/lib/kubelet/pods/ed148ce2-1bf9-44a7-b0bd-444c12bead6c/volumes"
Feb 18 14:27:28 crc kubenswrapper[4817]: I0218 14:27:28.205373 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff3fb952-c1b3-4311-832b-c8807407385e" path="/var/lib/kubelet/pods/ff3fb952-c1b3-4311-832b-c8807407385e/volumes"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.007814 4817 scope.go:117] "RemoveContainer" containerID="7ea18320b93e4d6a63ee602f94a46f9b3d8f6cd87e1ea610e7d55edd2aef27b6"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.040583 4817 scope.go:117] "RemoveContainer" containerID="28dcfcfe333100e0d09c762f4330a068f4ac89f4ea7b3316743027d120631d18"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.084623 4817 scope.go:117] "RemoveContainer" containerID="acef9937a409cfd792c07db69d073715e581438c9fab8c3beaa608147c79d31c"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.128236 4817 scope.go:117] "RemoveContainer" containerID="0e889fb035dddb82131229f54177b65d68938b655179dffd1d11fed85e0dee9c"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.192242 4817 scope.go:117] "RemoveContainer" containerID="2b988613ff01744e08861d50ddd930e5134baf7612fea8c6b33b9dfc712b239f"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.231504 4817 scope.go:117] "RemoveContainer" containerID="762c834cb57e2f8db2c45854f68cd8b6080465bc2c067c55577f0cd7b28e38c5"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.278179 4817 scope.go:117] "RemoveContainer" containerID="d2c990d70a0a202e89fdfdf55fe6e779315f8e8016b2d9abf431d4e19a6f7ac4"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.295795 4817 scope.go:117] "RemoveContainer" containerID="94d2be14a263212f3973117d7e35008b4526d374e1514a4b6f27a9f85b6c9a95"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.323683 4817 scope.go:117] "RemoveContainer" containerID="9535cedd3fd4e601b68b071a22fbf5c4ceeb7d050a20b90b8f5c374303d0f021"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.351616 4817 scope.go:117] "RemoveContainer" containerID="0cda4539915d564c1f1a0a904f5f9eee6275da5e919fceb4db7a2a5be677f158"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.383512 4817 scope.go:117] "RemoveContainer" containerID="3e71791cd6590943c0c6e0cf316c02d74c31a8d7ce97b9649c5b05b4ca2c8622"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.410212 4817 scope.go:117] "RemoveContainer" containerID="bbf5c931ca0895f50fd9e9110ea5016f6e9460cc528632ab583532d485da1404"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.453508 4817 scope.go:117] "RemoveContainer" containerID="d9cd83558493762888c997dff506122078861fe06345c3b74ea1ad02b2770ab1"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.486547 4817 scope.go:117] "RemoveContainer" containerID="5811f76d8389c264d315be4cb88f7078ce8c6626bf4cdee993d6f8845c7f2119"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.511036 4817 scope.go:117] "RemoveContainer" containerID="4344cd761fe1014382b5bf26b108be7d341da3d9a3a09754092aa47d2a7fa98f"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.537333 4817 scope.go:117] "RemoveContainer" containerID="1be98d9b34c1e9a8c03d153810f7eb61636ba64426214ab705cae525935c5242"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.561000 4817 scope.go:117] "RemoveContainer" containerID="363469e2aac9c38739b2c8a3ca00303e3889f1255e53a4cdddebfff3e1d4b78c"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.580435 4817 scope.go:117] "RemoveContainer" containerID="d8f4f516ab60608db5f1a975a5b5cf3b0fe21f143c617f74af2f246e0dbc549a"
Feb 18 14:27:30 crc kubenswrapper[4817]: I0218 14:27:30.607814 4817 scope.go:117] "RemoveContainer" containerID="80f7efcbe5f849c447341e3008d7c195009c7c63dfe561363936ec03f4637064"
Feb 18 14:27:35 crc kubenswrapper[4817]: I0218 14:27:35.173397 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"
Feb 18 14:27:35 crc kubenswrapper[4817]: E0218 14:27:35.174249 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:27:41 crc kubenswrapper[4817]: I0218 14:27:41.031291 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jwsrh"]
Feb 18 14:27:41 crc kubenswrapper[4817]: I0218 14:27:41.041145 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jwsrh"]
Feb 18 14:27:42 crc kubenswrapper[4817]: I0218 14:27:42.201832 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e8be8e-4ac1-4926-8790-e6910c1cbddf" path="/var/lib/kubelet/pods/d5e8be8e-4ac1-4926-8790-e6910c1cbddf/volumes"
Feb 18 14:27:47 crc kubenswrapper[4817]: I0218 14:27:47.172441 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"
Feb 18 14:27:47 crc kubenswrapper[4817]: E0218 14:27:47.173041 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:28:02 crc kubenswrapper[4817]: I0218 14:28:02.171701 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"
Feb 18 14:28:02 crc kubenswrapper[4817]: E0218 14:28:02.172659 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:28:11 crc kubenswrapper[4817]: I0218 14:28:11.054593 4817 generic.go:334] "Generic (PLEG): container finished" podID="d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9" containerID="9730ee73612795a3c8fa4bbc6f82ffd8b9b1aed164c4ff8e05dba8d4e9c606a0" exitCode=0
Feb 18 14:28:11 crc kubenswrapper[4817]: I0218 14:28:11.054868 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" event={"ID":"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9","Type":"ContainerDied","Data":"9730ee73612795a3c8fa4bbc6f82ffd8b9b1aed164c4ff8e05dba8d4e9c606a0"}
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.582528 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f"
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.754610 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-bootstrap-combined-ca-bundle\") pod \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") "
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.755759 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmjgb\" (UniqueName: \"kubernetes.io/projected/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-kube-api-access-hmjgb\") pod \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") "
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.755943 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-inventory\") pod \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") "
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.756258 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-ssh-key-openstack-edpm-ipam\") pod \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\" (UID: \"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9\") "
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.762341 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9" (UID: "d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.765191 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-kube-api-access-hmjgb" (OuterVolumeSpecName: "kube-api-access-hmjgb") pod "d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9" (UID: "d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9"). InnerVolumeSpecName "kube-api-access-hmjgb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.791476 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9" (UID: "d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.794951 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-inventory" (OuterVolumeSpecName: "inventory") pod "d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9" (UID: "d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.859513 4817 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.859551 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmjgb\" (UniqueName: \"kubernetes.io/projected/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-kube-api-access-hmjgb\") on node \"crc\" DevicePath \"\""
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.859566 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-inventory\") on node \"crc\" DevicePath \"\""
Feb 18 14:28:12 crc kubenswrapper[4817]: I0218 14:28:12.859576 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.078803 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f" event={"ID":"d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9","Type":"ContainerDied","Data":"af9c8cef9e9c4dd3a625a51c509bcb9c5dbbbd0475f342427c00d6d5b69a787b"}
Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.078845 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9c8cef9e9c4dd3a625a51c509bcb9c5dbbbd0475f342427c00d6d5b69a787b"
Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.078866 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f"
Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.160275 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc"]
Feb 18 14:28:13 crc kubenswrapper[4817]: E0218 14:28:13.160719 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1268dca-bb64-438a-8add-e369fc7c711b" containerName="extract-content"
Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.160739 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1268dca-bb64-438a-8add-e369fc7c711b" containerName="extract-content"
Feb 18 14:28:13 crc kubenswrapper[4817]: E0218 14:28:13.160761 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1268dca-bb64-438a-8add-e369fc7c711b" containerName="extract-utilities"
Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.160772 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1268dca-bb64-438a-8add-e369fc7c711b" containerName="extract-utilities"
Feb 18 14:28:13 crc kubenswrapper[4817]: E0218 14:28:13.160785 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1268dca-bb64-438a-8add-e369fc7c711b" containerName="registry-server"
Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.160792 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1268dca-bb64-438a-8add-e369fc7c711b" containerName="registry-server"
Feb 18 14:28:13 crc kubenswrapper[4817]: E0218 14:28:13.160806 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.160814 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.161090
4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1268dca-bb64-438a-8add-e369fc7c711b" containerName="registry-server" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.161122 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.161973 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.164100 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.164272 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.164441 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.164456 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.173556 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc"] Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.267654 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 
14:28:13.267717 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghz7r\" (UniqueName: \"kubernetes.io/projected/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-kube-api-access-ghz7r\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.267833 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.369469 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.369525 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghz7r\" (UniqueName: \"kubernetes.io/projected/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-kube-api-access-ghz7r\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.369591 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.373460 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.375705 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.386324 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghz7r\" (UniqueName: \"kubernetes.io/projected/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-kube-api-access-ghz7r\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:13 crc kubenswrapper[4817]: I0218 14:28:13.489164 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:28:14 crc kubenswrapper[4817]: I0218 14:28:14.026389 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc"] Feb 18 14:28:14 crc kubenswrapper[4817]: I0218 14:28:14.089295 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" event={"ID":"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807","Type":"ContainerStarted","Data":"d9ee74c33f26af350a22a154094ea45474efdac238a0d1da068980830611c3ba"} Feb 18 14:28:15 crc kubenswrapper[4817]: I0218 14:28:15.103261 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" event={"ID":"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807","Type":"ContainerStarted","Data":"cdee6aa24052e44ba63e0e16dd3c469c7263307205e73bf59721a21dd6aec780"} Feb 18 14:28:15 crc kubenswrapper[4817]: I0218 14:28:15.130554 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" podStartSLOduration=1.68255023 podStartE2EDuration="2.130530779s" podCreationTimestamp="2026-02-18 14:28:13 +0000 UTC" firstStartedPulling="2026-02-18 14:28:14.029853335 +0000 UTC m=+1756.605389318" lastFinishedPulling="2026-02-18 14:28:14.477833894 +0000 UTC m=+1757.053369867" observedRunningTime="2026-02-18 14:28:15.123638004 +0000 UTC m=+1757.699173987" watchObservedRunningTime="2026-02-18 14:28:15.130530779 +0000 UTC m=+1757.706066782" Feb 18 14:28:17 crc kubenswrapper[4817]: I0218 14:28:17.171889 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:28:17 crc kubenswrapper[4817]: E0218 14:28:17.173279 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:28:30 crc kubenswrapper[4817]: I0218 14:28:30.986348 4817 scope.go:117] "RemoveContainer" containerID="2c2b15a5408ff421bfb9a43b871a3db26e198cfd4b2742e4727dc629955b23a9" Feb 18 14:28:32 crc kubenswrapper[4817]: I0218 14:28:32.171708 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:28:32 crc kubenswrapper[4817]: E0218 14:28:32.172363 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.046045 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-khzqv"] Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.057508 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-khzqv"] Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.786068 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lflm7"] Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.790326 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.798238 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lflm7"] Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.880311 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-catalog-content\") pod \"community-operators-lflm7\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.880369 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdhhf\" (UniqueName: \"kubernetes.io/projected/532eea9b-6412-4d74-b289-39b3bdab1f60-kube-api-access-mdhhf\") pod \"community-operators-lflm7\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.880467 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-utilities\") pod \"community-operators-lflm7\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.982488 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-catalog-content\") pod \"community-operators-lflm7\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.982541 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mdhhf\" (UniqueName: \"kubernetes.io/projected/532eea9b-6412-4d74-b289-39b3bdab1f60-kube-api-access-mdhhf\") pod \"community-operators-lflm7\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.982618 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-utilities\") pod \"community-operators-lflm7\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.983304 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-catalog-content\") pod \"community-operators-lflm7\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:37 crc kubenswrapper[4817]: I0218 14:28:37.983320 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-utilities\") pod \"community-operators-lflm7\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:38 crc kubenswrapper[4817]: I0218 14:28:38.006334 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdhhf\" (UniqueName: \"kubernetes.io/projected/532eea9b-6412-4d74-b289-39b3bdab1f60-kube-api-access-mdhhf\") pod \"community-operators-lflm7\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:38 crc kubenswrapper[4817]: I0218 14:28:38.126958 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:38 crc kubenswrapper[4817]: I0218 14:28:38.198359 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf9142d-d5ef-4793-a36f-9f91a8146527" path="/var/lib/kubelet/pods/bbf9142d-d5ef-4793-a36f-9f91a8146527/volumes" Feb 18 14:28:38 crc kubenswrapper[4817]: I0218 14:28:38.621695 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lflm7"] Feb 18 14:28:39 crc kubenswrapper[4817]: I0218 14:28:39.325623 4817 generic.go:334] "Generic (PLEG): container finished" podID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerID="304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac" exitCode=0 Feb 18 14:28:39 crc kubenswrapper[4817]: I0218 14:28:39.325679 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lflm7" event={"ID":"532eea9b-6412-4d74-b289-39b3bdab1f60","Type":"ContainerDied","Data":"304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac"} Feb 18 14:28:39 crc kubenswrapper[4817]: I0218 14:28:39.325709 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lflm7" event={"ID":"532eea9b-6412-4d74-b289-39b3bdab1f60","Type":"ContainerStarted","Data":"4df7c17e592b2d00cd4d4903e470e35146f3eef9182bd6407f56e48d2658740d"} Feb 18 14:28:40 crc kubenswrapper[4817]: I0218 14:28:40.337535 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lflm7" event={"ID":"532eea9b-6412-4d74-b289-39b3bdab1f60","Type":"ContainerStarted","Data":"11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630"} Feb 18 14:28:41 crc kubenswrapper[4817]: I0218 14:28:41.043810 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-sm9px"] Feb 18 14:28:41 crc kubenswrapper[4817]: I0218 14:28:41.058204 4817 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/placement-db-sync-sm9px"] Feb 18 14:28:42 crc kubenswrapper[4817]: I0218 14:28:42.187126 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ca3146-2c85-46da-baeb-ea06b64ffac0" path="/var/lib/kubelet/pods/a7ca3146-2c85-46da-baeb-ea06b64ffac0/volumes" Feb 18 14:28:42 crc kubenswrapper[4817]: I0218 14:28:42.363995 4817 generic.go:334] "Generic (PLEG): container finished" podID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerID="11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630" exitCode=0 Feb 18 14:28:42 crc kubenswrapper[4817]: I0218 14:28:42.364070 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lflm7" event={"ID":"532eea9b-6412-4d74-b289-39b3bdab1f60","Type":"ContainerDied","Data":"11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630"} Feb 18 14:28:43 crc kubenswrapper[4817]: I0218 14:28:43.376213 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lflm7" event={"ID":"532eea9b-6412-4d74-b289-39b3bdab1f60","Type":"ContainerStarted","Data":"939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa"} Feb 18 14:28:43 crc kubenswrapper[4817]: I0218 14:28:43.397350 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lflm7" podStartSLOduration=2.968495476 podStartE2EDuration="6.397331887s" podCreationTimestamp="2026-02-18 14:28:37 +0000 UTC" firstStartedPulling="2026-02-18 14:28:39.328421581 +0000 UTC m=+1781.903957564" lastFinishedPulling="2026-02-18 14:28:42.757258002 +0000 UTC m=+1785.332793975" observedRunningTime="2026-02-18 14:28:43.393342666 +0000 UTC m=+1785.968878669" watchObservedRunningTime="2026-02-18 14:28:43.397331887 +0000 UTC m=+1785.972867870" Feb 18 14:28:46 crc kubenswrapper[4817]: I0218 14:28:46.172456 4817 scope.go:117] "RemoveContainer" 
containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:28:46 crc kubenswrapper[4817]: E0218 14:28:46.173346 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:28:48 crc kubenswrapper[4817]: I0218 14:28:48.127716 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:48 crc kubenswrapper[4817]: I0218 14:28:48.128269 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:48 crc kubenswrapper[4817]: I0218 14:28:48.187629 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:48 crc kubenswrapper[4817]: I0218 14:28:48.469823 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:48 crc kubenswrapper[4817]: I0218 14:28:48.521493 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lflm7"] Feb 18 14:28:50 crc kubenswrapper[4817]: I0218 14:28:50.442559 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lflm7" podUID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerName="registry-server" containerID="cri-o://939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa" gracePeriod=2 Feb 18 14:28:50 crc kubenswrapper[4817]: I0218 14:28:50.949543 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.070901 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-utilities\") pod \"532eea9b-6412-4d74-b289-39b3bdab1f60\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.071176 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdhhf\" (UniqueName: \"kubernetes.io/projected/532eea9b-6412-4d74-b289-39b3bdab1f60-kube-api-access-mdhhf\") pod \"532eea9b-6412-4d74-b289-39b3bdab1f60\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.071295 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-catalog-content\") pod \"532eea9b-6412-4d74-b289-39b3bdab1f60\" (UID: \"532eea9b-6412-4d74-b289-39b3bdab1f60\") " Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.072236 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-utilities" (OuterVolumeSpecName: "utilities") pod "532eea9b-6412-4d74-b289-39b3bdab1f60" (UID: "532eea9b-6412-4d74-b289-39b3bdab1f60"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.072627 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.085536 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532eea9b-6412-4d74-b289-39b3bdab1f60-kube-api-access-mdhhf" (OuterVolumeSpecName: "kube-api-access-mdhhf") pod "532eea9b-6412-4d74-b289-39b3bdab1f60" (UID: "532eea9b-6412-4d74-b289-39b3bdab1f60"). InnerVolumeSpecName "kube-api-access-mdhhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.126480 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "532eea9b-6412-4d74-b289-39b3bdab1f60" (UID: "532eea9b-6412-4d74-b289-39b3bdab1f60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.174171 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdhhf\" (UniqueName: \"kubernetes.io/projected/532eea9b-6412-4d74-b289-39b3bdab1f60-kube-api-access-mdhhf\") on node \"crc\" DevicePath \"\"" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.174211 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/532eea9b-6412-4d74-b289-39b3bdab1f60-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.453837 4817 generic.go:334] "Generic (PLEG): container finished" podID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerID="939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa" exitCode=0 Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.454220 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lflm7" event={"ID":"532eea9b-6412-4d74-b289-39b3bdab1f60","Type":"ContainerDied","Data":"939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa"} Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.454282 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lflm7" event={"ID":"532eea9b-6412-4d74-b289-39b3bdab1f60","Type":"ContainerDied","Data":"4df7c17e592b2d00cd4d4903e470e35146f3eef9182bd6407f56e48d2658740d"} Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.454303 4817 scope.go:117] "RemoveContainer" containerID="939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.454235 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lflm7" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.475221 4817 scope.go:117] "RemoveContainer" containerID="11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.503160 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lflm7"] Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.511063 4817 scope.go:117] "RemoveContainer" containerID="304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.516429 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lflm7"] Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.550711 4817 scope.go:117] "RemoveContainer" containerID="939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa" Feb 18 14:28:51 crc kubenswrapper[4817]: E0218 14:28:51.551262 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa\": container with ID starting with 939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa not found: ID does not exist" containerID="939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.551322 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa"} err="failed to get container status \"939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa\": rpc error: code = NotFound desc = could not find container \"939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa\": container with ID starting with 939a31cc7ea630e033a47aa81cb1695be2d5799b22f54095184488b47e7532fa not 
found: ID does not exist" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.551356 4817 scope.go:117] "RemoveContainer" containerID="11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630" Feb 18 14:28:51 crc kubenswrapper[4817]: E0218 14:28:51.551809 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630\": container with ID starting with 11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630 not found: ID does not exist" containerID="11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.551848 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630"} err="failed to get container status \"11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630\": rpc error: code = NotFound desc = could not find container \"11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630\": container with ID starting with 11909d516aeac95b79f70c49e00c32f4889654924cbf8ad66f299d0e26889630 not found: ID does not exist" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.551876 4817 scope.go:117] "RemoveContainer" containerID="304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac" Feb 18 14:28:51 crc kubenswrapper[4817]: E0218 14:28:51.552240 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac\": container with ID starting with 304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac not found: ID does not exist" containerID="304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac" Feb 18 14:28:51 crc kubenswrapper[4817]: I0218 14:28:51.552267 4817 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac"} err="failed to get container status \"304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac\": rpc error: code = NotFound desc = could not find container \"304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac\": container with ID starting with 304c0720fb1707cdb5a0a1a8787c4d711625a7ccd3cc3b9a8285884ca73062ac not found: ID does not exist" Feb 18 14:28:52 crc kubenswrapper[4817]: I0218 14:28:52.191340 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532eea9b-6412-4d74-b289-39b3bdab1f60" path="/var/lib/kubelet/pods/532eea9b-6412-4d74-b289-39b3bdab1f60/volumes" Feb 18 14:28:58 crc kubenswrapper[4817]: I0218 14:28:58.051650 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-dzzzv"] Feb 18 14:28:58 crc kubenswrapper[4817]: I0218 14:28:58.062375 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-dzzzv"] Feb 18 14:28:58 crc kubenswrapper[4817]: I0218 14:28:58.187775 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="162fd834-bb59-43f4-98f0-9acb0333e71c" path="/var/lib/kubelet/pods/162fd834-bb59-43f4-98f0-9acb0333e71c/volumes" Feb 18 14:29:00 crc kubenswrapper[4817]: I0218 14:29:00.172633 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:29:00 crc kubenswrapper[4817]: E0218 14:29:00.173347 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" 
podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:29:14 crc kubenswrapper[4817]: I0218 14:29:14.171735 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:29:14 crc kubenswrapper[4817]: E0218 14:29:14.172577 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:29:26 crc kubenswrapper[4817]: I0218 14:29:26.172732 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:29:26 crc kubenswrapper[4817]: E0218 14:29:26.174660 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:29:28 crc kubenswrapper[4817]: I0218 14:29:28.045305 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-2pxsw"] Feb 18 14:29:28 crc kubenswrapper[4817]: I0218 14:29:28.054714 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-w48xt"] Feb 18 14:29:28 crc kubenswrapper[4817]: I0218 14:29:28.064479 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-2pxsw"] Feb 18 14:29:28 crc kubenswrapper[4817]: I0218 14:29:28.075668 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-db-sync-w48xt"] Feb 18 14:29:28 crc kubenswrapper[4817]: I0218 14:29:28.183723 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f0e519-a5f3-45a2-a5da-e10f851f18df" path="/var/lib/kubelet/pods/07f0e519-a5f3-45a2-a5da-e10f851f18df/volumes" Feb 18 14:29:28 crc kubenswrapper[4817]: I0218 14:29:28.185406 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb12a33e-172a-4c2d-8c97-8ae5486ce22d" path="/var/lib/kubelet/pods/fb12a33e-172a-4c2d-8c97-8ae5486ce22d/volumes" Feb 18 14:29:30 crc kubenswrapper[4817]: I0218 14:29:30.039916 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-2fk5c"] Feb 18 14:29:30 crc kubenswrapper[4817]: I0218 14:29:30.056007 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-2fk5c"] Feb 18 14:29:30 crc kubenswrapper[4817]: I0218 14:29:30.185509 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de51007-ada2-49f5-90b2-11151899e3cf" path="/var/lib/kubelet/pods/8de51007-ada2-49f5-90b2-11151899e3cf/volumes" Feb 18 14:29:31 crc kubenswrapper[4817]: I0218 14:29:31.104407 4817 scope.go:117] "RemoveContainer" containerID="47021a4d2507c0e35400943910b6d9219bf27a681ec61204b6398fd99f2a3061" Feb 18 14:29:31 crc kubenswrapper[4817]: I0218 14:29:31.151249 4817 scope.go:117] "RemoveContainer" containerID="215f78b9b93a35d655cb71cb9ad214093e60b960a2b02feae20952738469759d" Feb 18 14:29:31 crc kubenswrapper[4817]: I0218 14:29:31.233888 4817 scope.go:117] "RemoveContainer" containerID="9adb2e5a4fdf34e1a9a8889725d117c3402d9cc2068cccc21ebd90da49d70abf" Feb 18 14:29:31 crc kubenswrapper[4817]: I0218 14:29:31.273083 4817 scope.go:117] "RemoveContainer" containerID="a2b4cacda3d056587cd39f7a1cbb66789b5d848ebe2fd068888ed5b7d9b5436a" Feb 18 14:29:31 crc kubenswrapper[4817]: I0218 14:29:31.333141 4817 scope.go:117] "RemoveContainer" containerID="e793335c53e9d73b4e1af11f18328dabf6d835bbe47c5bd702d8544002a500f0" 
Feb 18 14:29:31 crc kubenswrapper[4817]: I0218 14:29:31.376767 4817 scope.go:117] "RemoveContainer" containerID="567f816430e7e0392dac1420abe93af070fcfc13cb22e0b64a9d79548fe9e016" Feb 18 14:29:41 crc kubenswrapper[4817]: I0218 14:29:41.171861 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:29:41 crc kubenswrapper[4817]: E0218 14:29:41.172617 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:29:53 crc kubenswrapper[4817]: I0218 14:29:53.171298 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:29:53 crc kubenswrapper[4817]: E0218 14:29:53.172248 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.042782 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-pjb2r"] Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.053110 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rk57x"] Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.065218 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-db-create-rk57x"] Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.074218 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-pjb2r"] Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.159118 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz"] Feb 18 14:30:00 crc kubenswrapper[4817]: E0218 14:30:00.164452 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerName="extract-utilities" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.164492 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerName="extract-utilities" Feb 18 14:30:00 crc kubenswrapper[4817]: E0218 14:30:00.164522 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerName="registry-server" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.164533 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerName="registry-server" Feb 18 14:30:00 crc kubenswrapper[4817]: E0218 14:30:00.164555 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerName="extract-content" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.164564 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerName="extract-content" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.164769 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="532eea9b-6412-4d74-b289-39b3bdab1f60" containerName="registry-server" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.165602 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.171462 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.185899 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.199568 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1" path="/var/lib/kubelet/pods/10d7cc80-38b6-46ed-8ec0-4c8ae03eb2b1/volumes" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.200405 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe876a5-4499-40ff-b468-d395efa01d26" path="/var/lib/kubelet/pods/4fe876a5-4499-40ff-b468-d395efa01d26/volumes" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.201103 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz"] Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.267470 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mc6v\" (UniqueName: \"kubernetes.io/projected/868382c3-7437-4ae4-9dd7-0f7629fe09ab-kube-api-access-9mc6v\") pod \"collect-profiles-29523750-79hmz\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.267535 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/868382c3-7437-4ae4-9dd7-0f7629fe09ab-config-volume\") pod \"collect-profiles-29523750-79hmz\" (UID: 
\"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.267563 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/868382c3-7437-4ae4-9dd7-0f7629fe09ab-secret-volume\") pod \"collect-profiles-29523750-79hmz\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.369701 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/868382c3-7437-4ae4-9dd7-0f7629fe09ab-config-volume\") pod \"collect-profiles-29523750-79hmz\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.369757 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/868382c3-7437-4ae4-9dd7-0f7629fe09ab-secret-volume\") pod \"collect-profiles-29523750-79hmz\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.369935 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mc6v\" (UniqueName: \"kubernetes.io/projected/868382c3-7437-4ae4-9dd7-0f7629fe09ab-kube-api-access-9mc6v\") pod \"collect-profiles-29523750-79hmz\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.371072 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/868382c3-7437-4ae4-9dd7-0f7629fe09ab-config-volume\") pod \"collect-profiles-29523750-79hmz\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.391746 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/868382c3-7437-4ae4-9dd7-0f7629fe09ab-secret-volume\") pod \"collect-profiles-29523750-79hmz\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.397398 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mc6v\" (UniqueName: \"kubernetes.io/projected/868382c3-7437-4ae4-9dd7-0f7629fe09ab-kube-api-access-9mc6v\") pod \"collect-profiles-29523750-79hmz\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.502429 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:00 crc kubenswrapper[4817]: I0218 14:30:00.979613 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz"] Feb 18 14:30:01 crc kubenswrapper[4817]: I0218 14:30:01.031694 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-54xpq"] Feb 18 14:30:01 crc kubenswrapper[4817]: I0218 14:30:01.042878 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-54xpq"] Feb 18 14:30:01 crc kubenswrapper[4817]: I0218 14:30:01.068717 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" event={"ID":"868382c3-7437-4ae4-9dd7-0f7629fe09ab","Type":"ContainerStarted","Data":"06438531638b43cf24eacf7735fd704798c5a6188b1eab824db06936ca5b3e73"} Feb 18 14:30:02 crc kubenswrapper[4817]: I0218 14:30:02.088542 4817 generic.go:334] "Generic (PLEG): container finished" podID="868382c3-7437-4ae4-9dd7-0f7629fe09ab" containerID="22578da341ece50346a885ac68787bde287f82a2032211be12e36cb991fe55fa" exitCode=0 Feb 18 14:30:02 crc kubenswrapper[4817]: I0218 14:30:02.088887 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" event={"ID":"868382c3-7437-4ae4-9dd7-0f7629fe09ab","Type":"ContainerDied","Data":"22578da341ece50346a885ac68787bde287f82a2032211be12e36cb991fe55fa"} Feb 18 14:30:02 crc kubenswrapper[4817]: I0218 14:30:02.184238 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42f4b322-ace9-42b0-944b-c5fa3181fc54" path="/var/lib/kubelet/pods/42f4b322-ace9-42b0-944b-c5fa3181fc54/volumes" Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.029907 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-16c7-account-create-update-qgzsl"] Feb 18 
14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.040773 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e448-account-create-update-fzq87"] Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.052738 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-16c7-account-create-update-qgzsl"] Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.063952 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e448-account-create-update-fzq87"] Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.074509 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9b47-account-create-update-cxfbq"] Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.085751 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9b47-account-create-update-cxfbq"] Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.494442 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.635208 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/868382c3-7437-4ae4-9dd7-0f7629fe09ab-config-volume\") pod \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.635618 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/868382c3-7437-4ae4-9dd7-0f7629fe09ab-secret-volume\") pod \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.635955 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mc6v\" (UniqueName: \"kubernetes.io/projected/868382c3-7437-4ae4-9dd7-0f7629fe09ab-kube-api-access-9mc6v\") pod \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\" (UID: \"868382c3-7437-4ae4-9dd7-0f7629fe09ab\") " Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.636238 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/868382c3-7437-4ae4-9dd7-0f7629fe09ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "868382c3-7437-4ae4-9dd7-0f7629fe09ab" (UID: "868382c3-7437-4ae4-9dd7-0f7629fe09ab"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.636753 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/868382c3-7437-4ae4-9dd7-0f7629fe09ab-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.641230 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868382c3-7437-4ae4-9dd7-0f7629fe09ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "868382c3-7437-4ae4-9dd7-0f7629fe09ab" (UID: "868382c3-7437-4ae4-9dd7-0f7629fe09ab"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.643254 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868382c3-7437-4ae4-9dd7-0f7629fe09ab-kube-api-access-9mc6v" (OuterVolumeSpecName: "kube-api-access-9mc6v") pod "868382c3-7437-4ae4-9dd7-0f7629fe09ab" (UID: "868382c3-7437-4ae4-9dd7-0f7629fe09ab"). InnerVolumeSpecName "kube-api-access-9mc6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.738611 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mc6v\" (UniqueName: \"kubernetes.io/projected/868382c3-7437-4ae4-9dd7-0f7629fe09ab-kube-api-access-9mc6v\") on node \"crc\" DevicePath \"\"" Feb 18 14:30:03 crc kubenswrapper[4817]: I0218 14:30:03.738909 4817 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/868382c3-7437-4ae4-9dd7-0f7629fe09ab-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 14:30:04 crc kubenswrapper[4817]: I0218 14:30:04.110272 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" event={"ID":"868382c3-7437-4ae4-9dd7-0f7629fe09ab","Type":"ContainerDied","Data":"06438531638b43cf24eacf7735fd704798c5a6188b1eab824db06936ca5b3e73"} Feb 18 14:30:04 crc kubenswrapper[4817]: I0218 14:30:04.110631 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06438531638b43cf24eacf7735fd704798c5a6188b1eab824db06936ca5b3e73" Feb 18 14:30:04 crc kubenswrapper[4817]: I0218 14:30:04.110389 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz" Feb 18 14:30:04 crc kubenswrapper[4817]: I0218 14:30:04.172420 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:30:04 crc kubenswrapper[4817]: E0218 14:30:04.172798 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:30:04 crc kubenswrapper[4817]: I0218 14:30:04.187505 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f9dae0-f2ed-4f91-b922-6f3432c8997d" path="/var/lib/kubelet/pods/43f9dae0-f2ed-4f91-b922-6f3432c8997d/volumes" Feb 18 14:30:04 crc kubenswrapper[4817]: I0218 14:30:04.188137 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="585ec0f8-a374-44ae-8b97-024af4983f69" path="/var/lib/kubelet/pods/585ec0f8-a374-44ae-8b97-024af4983f69/volumes" Feb 18 14:30:04 crc kubenswrapper[4817]: I0218 14:30:04.188693 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d3725d-5bb0-4edd-b707-6690d2ac99f5" path="/var/lib/kubelet/pods/c5d3725d-5bb0-4edd-b707-6690d2ac99f5/volumes" Feb 18 14:30:04 crc kubenswrapper[4817]: I0218 14:30:04.587496 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644"] Feb 18 14:30:04 crc kubenswrapper[4817]: I0218 14:30:04.597279 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523705-gm644"] Feb 18 14:30:06 crc kubenswrapper[4817]: I0218 14:30:06.184160 4817 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f85543c-dccf-4a3e-be40-7305a2e49d1d" path="/var/lib/kubelet/pods/3f85543c-dccf-4a3e-be40-7305a2e49d1d/volumes" Feb 18 14:30:06 crc kubenswrapper[4817]: E0218 14:30:06.814283 4817 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b90b086_b1c9_4a0a_9f4e_ebf1c8beb807.slice/crio-conmon-cdee6aa24052e44ba63e0e16dd3c469c7263307205e73bf59721a21dd6aec780.scope\": RecentStats: unable to find data in memory cache]" Feb 18 14:30:07 crc kubenswrapper[4817]: I0218 14:30:07.139292 4817 generic.go:334] "Generic (PLEG): container finished" podID="5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807" containerID="cdee6aa24052e44ba63e0e16dd3c469c7263307205e73bf59721a21dd6aec780" exitCode=0 Feb 18 14:30:07 crc kubenswrapper[4817]: I0218 14:30:07.139344 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" event={"ID":"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807","Type":"ContainerDied","Data":"cdee6aa24052e44ba63e0e16dd3c469c7263307205e73bf59721a21dd6aec780"} Feb 18 14:30:08 crc kubenswrapper[4817]: I0218 14:30:08.796622 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:30:08 crc kubenswrapper[4817]: I0218 14:30:08.902917 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghz7r\" (UniqueName: \"kubernetes.io/projected/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-kube-api-access-ghz7r\") pod \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " Feb 18 14:30:08 crc kubenswrapper[4817]: I0218 14:30:08.903060 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-inventory\") pod \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " Feb 18 14:30:08 crc kubenswrapper[4817]: I0218 14:30:08.903111 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-ssh-key-openstack-edpm-ipam\") pod \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\" (UID: \"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807\") " Feb 18 14:30:08 crc kubenswrapper[4817]: I0218 14:30:08.908682 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-kube-api-access-ghz7r" (OuterVolumeSpecName: "kube-api-access-ghz7r") pod "5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807" (UID: "5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807"). InnerVolumeSpecName "kube-api-access-ghz7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:30:08 crc kubenswrapper[4817]: I0218 14:30:08.933650 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-inventory" (OuterVolumeSpecName: "inventory") pod "5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807" (UID: "5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:30:08 crc kubenswrapper[4817]: I0218 14:30:08.955473 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807" (UID: "5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.006228 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.006273 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.006288 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghz7r\" (UniqueName: \"kubernetes.io/projected/5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807-kube-api-access-ghz7r\") on node \"crc\" DevicePath \"\"" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.162167 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" event={"ID":"5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807","Type":"ContainerDied","Data":"d9ee74c33f26af350a22a154094ea45474efdac238a0d1da068980830611c3ba"} Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.162223 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9ee74c33f26af350a22a154094ea45474efdac238a0d1da068980830611c3ba" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 
14:30:09.162549 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.242240 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7"] Feb 18 14:30:09 crc kubenswrapper[4817]: E0218 14:30:09.242704 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868382c3-7437-4ae4-9dd7-0f7629fe09ab" containerName="collect-profiles" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.242730 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="868382c3-7437-4ae4-9dd7-0f7629fe09ab" containerName="collect-profiles" Feb 18 14:30:09 crc kubenswrapper[4817]: E0218 14:30:09.242781 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.242792 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.243022 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="868382c3-7437-4ae4-9dd7-0f7629fe09ab" containerName="collect-profiles" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.243049 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.245272 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.249896 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.250295 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.250506 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.250817 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.258798 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7"] Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.413550 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2md4q\" (UniqueName: \"kubernetes.io/projected/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-kube-api-access-2md4q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vftp7\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.414178 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vftp7\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 
14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.414214 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vftp7\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.516369 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vftp7\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.516414 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vftp7\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.516441 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2md4q\" (UniqueName: \"kubernetes.io/projected/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-kube-api-access-2md4q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vftp7\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.520119 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vftp7\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.530785 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vftp7\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.533729 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2md4q\" (UniqueName: \"kubernetes.io/projected/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-kube-api-access-2md4q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vftp7\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:30:09 crc kubenswrapper[4817]: I0218 14:30:09.563381 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:30:10 crc kubenswrapper[4817]: I0218 14:30:10.145933 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7"] Feb 18 14:30:10 crc kubenswrapper[4817]: I0218 14:30:10.149972 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 14:30:10 crc kubenswrapper[4817]: I0218 14:30:10.191084 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" event={"ID":"f04dc5ce-0657-4e8c-8c0a-3b86924ea903","Type":"ContainerStarted","Data":"4b0845a4fa60ea88dbda152e24aa3c55f84bc3433c7219c9c728890cfdca0ac1"} Feb 18 14:30:11 crc kubenswrapper[4817]: I0218 14:30:11.193470 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" event={"ID":"f04dc5ce-0657-4e8c-8c0a-3b86924ea903","Type":"ContainerStarted","Data":"d9906d36a69122eaf6fc8dddd5e50b19060c07c9f48fbb47e04730965744e215"} Feb 18 14:30:11 crc kubenswrapper[4817]: I0218 14:30:11.211639 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" podStartSLOduration=1.6690477929999998 podStartE2EDuration="2.211617552s" podCreationTimestamp="2026-02-18 14:30:09 +0000 UTC" firstStartedPulling="2026-02-18 14:30:10.149776664 +0000 UTC m=+1872.725312647" lastFinishedPulling="2026-02-18 14:30:10.692346423 +0000 UTC m=+1873.267882406" observedRunningTime="2026-02-18 14:30:11.209762555 +0000 UTC m=+1873.785298548" watchObservedRunningTime="2026-02-18 14:30:11.211617552 +0000 UTC m=+1873.787153535" Feb 18 14:30:17 crc kubenswrapper[4817]: I0218 14:30:17.171882 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 
18 14:30:17 crc kubenswrapper[4817]: E0218 14:30:17.172692 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:30:28 crc kubenswrapper[4817]: I0218 14:30:28.177602 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:30:28 crc kubenswrapper[4817]: E0218 14:30:28.178302 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:30:31 crc kubenswrapper[4817]: I0218 14:30:31.699830 4817 scope.go:117] "RemoveContainer" containerID="a8000b31370d1a61d32be27c11e56e9a063f1030ec9c7d7906a0b4c19aa8246a" Feb 18 14:30:31 crc kubenswrapper[4817]: I0218 14:30:31.728530 4817 scope.go:117] "RemoveContainer" containerID="285489d62434a2d1524abfb4ca7649979b15a2848e6d6d60c03d663094c8870b" Feb 18 14:30:31 crc kubenswrapper[4817]: I0218 14:30:31.793335 4817 scope.go:117] "RemoveContainer" containerID="b7c54389cd72035e803566681b0052acb780bca86033ace971da40787be37322" Feb 18 14:30:31 crc kubenswrapper[4817]: I0218 14:30:31.850119 4817 scope.go:117] "RemoveContainer" containerID="0d0b900a8db1d295b441b5270730b75e4e473ab00702ff85d8156305dc718499" Feb 18 14:30:31 crc kubenswrapper[4817]: I0218 14:30:31.899594 4817 scope.go:117] "RemoveContainer" 
containerID="a57f60ed37120dae989c8e17c03a3028d7155108550b32a921b8a3e1b9462836" Feb 18 14:30:31 crc kubenswrapper[4817]: I0218 14:30:31.952870 4817 scope.go:117] "RemoveContainer" containerID="d103343187fc0b61663b10abdb9c7a21d00e8d59353a5b51024fe08bcfe57e4d" Feb 18 14:30:31 crc kubenswrapper[4817]: I0218 14:30:31.999090 4817 scope.go:117] "RemoveContainer" containerID="7536b26f3ab7582e5f8c97ef070a635653e8f1b7325cb897f3d1d94e52ad96ac" Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.676570 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vrphp"] Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.679904 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.689913 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrphp"] Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.770558 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwczc\" (UniqueName: \"kubernetes.io/projected/d96953ea-2054-4676-9ead-94870b6297d6-kube-api-access-kwczc\") pod \"certified-operators-vrphp\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.770740 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-utilities\") pod \"certified-operators-vrphp\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.770832 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-catalog-content\") pod \"certified-operators-vrphp\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.873432 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwczc\" (UniqueName: \"kubernetes.io/projected/d96953ea-2054-4676-9ead-94870b6297d6-kube-api-access-kwczc\") pod \"certified-operators-vrphp\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.873526 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-utilities\") pod \"certified-operators-vrphp\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.873619 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-catalog-content\") pod \"certified-operators-vrphp\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.874046 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-utilities\") pod \"certified-operators-vrphp\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.874053 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-catalog-content\") pod \"certified-operators-vrphp\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:40 crc kubenswrapper[4817]: I0218 14:30:40.902615 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwczc\" (UniqueName: \"kubernetes.io/projected/d96953ea-2054-4676-9ead-94870b6297d6-kube-api-access-kwczc\") pod \"certified-operators-vrphp\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:41 crc kubenswrapper[4817]: I0218 14:30:41.010503 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:41 crc kubenswrapper[4817]: I0218 14:30:41.507607 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrphp"] Feb 18 14:30:42 crc kubenswrapper[4817]: I0218 14:30:42.503907 4817 generic.go:334] "Generic (PLEG): container finished" podID="d96953ea-2054-4676-9ead-94870b6297d6" containerID="1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f" exitCode=0 Feb 18 14:30:42 crc kubenswrapper[4817]: I0218 14:30:42.503966 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrphp" event={"ID":"d96953ea-2054-4676-9ead-94870b6297d6","Type":"ContainerDied","Data":"1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f"} Feb 18 14:30:42 crc kubenswrapper[4817]: I0218 14:30:42.504019 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrphp" event={"ID":"d96953ea-2054-4676-9ead-94870b6297d6","Type":"ContainerStarted","Data":"d9763bc3f1adf82d86d4429106230db412381f92c9e11586226d05b911395cbd"} Feb 18 14:30:43 crc kubenswrapper[4817]: I0218 14:30:43.172306 4817 scope.go:117] "RemoveContainer" 
containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:30:43 crc kubenswrapper[4817]: E0218 14:30:43.172858 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:30:43 crc kubenswrapper[4817]: I0218 14:30:43.515185 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrphp" event={"ID":"d96953ea-2054-4676-9ead-94870b6297d6","Type":"ContainerStarted","Data":"4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520"} Feb 18 14:30:45 crc kubenswrapper[4817]: I0218 14:30:45.534504 4817 generic.go:334] "Generic (PLEG): container finished" podID="d96953ea-2054-4676-9ead-94870b6297d6" containerID="4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520" exitCode=0 Feb 18 14:30:45 crc kubenswrapper[4817]: I0218 14:30:45.534769 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrphp" event={"ID":"d96953ea-2054-4676-9ead-94870b6297d6","Type":"ContainerDied","Data":"4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520"} Feb 18 14:30:46 crc kubenswrapper[4817]: I0218 14:30:46.546723 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrphp" event={"ID":"d96953ea-2054-4676-9ead-94870b6297d6","Type":"ContainerStarted","Data":"c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a"} Feb 18 14:30:46 crc kubenswrapper[4817]: I0218 14:30:46.569729 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vrphp" 
podStartSLOduration=3.150199834 podStartE2EDuration="6.569708546s" podCreationTimestamp="2026-02-18 14:30:40 +0000 UTC" firstStartedPulling="2026-02-18 14:30:42.507109683 +0000 UTC m=+1905.082645676" lastFinishedPulling="2026-02-18 14:30:45.926618405 +0000 UTC m=+1908.502154388" observedRunningTime="2026-02-18 14:30:46.566501885 +0000 UTC m=+1909.142037878" watchObservedRunningTime="2026-02-18 14:30:46.569708546 +0000 UTC m=+1909.145244549" Feb 18 14:30:51 crc kubenswrapper[4817]: I0218 14:30:51.011295 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:51 crc kubenswrapper[4817]: I0218 14:30:51.011507 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:51 crc kubenswrapper[4817]: I0218 14:30:51.058170 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:51 crc kubenswrapper[4817]: I0218 14:30:51.649818 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:51 crc kubenswrapper[4817]: I0218 14:30:51.710041 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrphp"] Feb 18 14:30:53 crc kubenswrapper[4817]: I0218 14:30:53.620010 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vrphp" podUID="d96953ea-2054-4676-9ead-94870b6297d6" containerName="registry-server" containerID="cri-o://c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a" gracePeriod=2 Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.165159 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.289480 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-catalog-content\") pod \"d96953ea-2054-4676-9ead-94870b6297d6\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.289572 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-utilities\") pod \"d96953ea-2054-4676-9ead-94870b6297d6\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.289610 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwczc\" (UniqueName: \"kubernetes.io/projected/d96953ea-2054-4676-9ead-94870b6297d6-kube-api-access-kwczc\") pod \"d96953ea-2054-4676-9ead-94870b6297d6\" (UID: \"d96953ea-2054-4676-9ead-94870b6297d6\") " Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.291836 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-utilities" (OuterVolumeSpecName: "utilities") pod "d96953ea-2054-4676-9ead-94870b6297d6" (UID: "d96953ea-2054-4676-9ead-94870b6297d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.297016 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d96953ea-2054-4676-9ead-94870b6297d6-kube-api-access-kwczc" (OuterVolumeSpecName: "kube-api-access-kwczc") pod "d96953ea-2054-4676-9ead-94870b6297d6" (UID: "d96953ea-2054-4676-9ead-94870b6297d6"). InnerVolumeSpecName "kube-api-access-kwczc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.361639 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d96953ea-2054-4676-9ead-94870b6297d6" (UID: "d96953ea-2054-4676-9ead-94870b6297d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.392592 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.392626 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d96953ea-2054-4676-9ead-94870b6297d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.392637 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwczc\" (UniqueName: \"kubernetes.io/projected/d96953ea-2054-4676-9ead-94870b6297d6-kube-api-access-kwczc\") on node \"crc\" DevicePath \"\"" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.630025 4817 generic.go:334] "Generic (PLEG): container finished" podID="d96953ea-2054-4676-9ead-94870b6297d6" containerID="c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a" exitCode=0 Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.630061 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vrphp" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.630072 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrphp" event={"ID":"d96953ea-2054-4676-9ead-94870b6297d6","Type":"ContainerDied","Data":"c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a"} Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.630122 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrphp" event={"ID":"d96953ea-2054-4676-9ead-94870b6297d6","Type":"ContainerDied","Data":"d9763bc3f1adf82d86d4429106230db412381f92c9e11586226d05b911395cbd"} Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.630156 4817 scope.go:117] "RemoveContainer" containerID="c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.651193 4817 scope.go:117] "RemoveContainer" containerID="4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.671188 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrphp"] Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.681301 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vrphp"] Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.687641 4817 scope.go:117] "RemoveContainer" containerID="1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.721450 4817 scope.go:117] "RemoveContainer" containerID="c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a" Feb 18 14:30:54 crc kubenswrapper[4817]: E0218 14:30:54.721911 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a\": container with ID starting with c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a not found: ID does not exist" containerID="c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.721942 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a"} err="failed to get container status \"c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a\": rpc error: code = NotFound desc = could not find container \"c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a\": container with ID starting with c57af8ea02b04300baf55535851ae08971f2b644bbff6d79e245e014de23980a not found: ID does not exist" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.721964 4817 scope.go:117] "RemoveContainer" containerID="4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520" Feb 18 14:30:54 crc kubenswrapper[4817]: E0218 14:30:54.722599 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520\": container with ID starting with 4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520 not found: ID does not exist" containerID="4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.722660 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520"} err="failed to get container status \"4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520\": rpc error: code = NotFound desc = could not find container \"4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520\": container with ID 
starting with 4a9f45e449119cbd7775511b92b57750851b6e1eb732e6d154820188b49ca520 not found: ID does not exist" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.722677 4817 scope.go:117] "RemoveContainer" containerID="1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f" Feb 18 14:30:54 crc kubenswrapper[4817]: E0218 14:30:54.722867 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f\": container with ID starting with 1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f not found: ID does not exist" containerID="1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f" Feb 18 14:30:54 crc kubenswrapper[4817]: I0218 14:30:54.722922 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f"} err="failed to get container status \"1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f\": rpc error: code = NotFound desc = could not find container \"1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f\": container with ID starting with 1061eae23708d265ed2e660527915ad8354b687124ca71484eb8a99a88676c0f not found: ID does not exist" Feb 18 14:30:56 crc kubenswrapper[4817]: I0218 14:30:56.172053 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:30:56 crc kubenswrapper[4817]: E0218 14:30:56.172656 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" 
podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:30:56 crc kubenswrapper[4817]: I0218 14:30:56.183735 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d96953ea-2054-4676-9ead-94870b6297d6" path="/var/lib/kubelet/pods/d96953ea-2054-4676-9ead-94870b6297d6/volumes" Feb 18 14:30:57 crc kubenswrapper[4817]: I0218 14:30:57.049730 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9gmj9"] Feb 18 14:30:57 crc kubenswrapper[4817]: I0218 14:30:57.060552 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9gmj9"] Feb 18 14:30:58 crc kubenswrapper[4817]: I0218 14:30:58.192469 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64604bbb-190b-4850-97cc-07979a94d7aa" path="/var/lib/kubelet/pods/64604bbb-190b-4850-97cc-07979a94d7aa/volumes" Feb 18 14:31:07 crc kubenswrapper[4817]: I0218 14:31:07.172276 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:31:07 crc kubenswrapper[4817]: E0218 14:31:07.173283 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:31:18 crc kubenswrapper[4817]: I0218 14:31:18.178239 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:31:18 crc kubenswrapper[4817]: E0218 14:31:18.179051 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:31:21 crc kubenswrapper[4817]: I0218 14:31:21.890978 4817 generic.go:334] "Generic (PLEG): container finished" podID="f04dc5ce-0657-4e8c-8c0a-3b86924ea903" containerID="d9906d36a69122eaf6fc8dddd5e50b19060c07c9f48fbb47e04730965744e215" exitCode=0 Feb 18 14:31:21 crc kubenswrapper[4817]: I0218 14:31:21.891067 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" event={"ID":"f04dc5ce-0657-4e8c-8c0a-3b86924ea903","Type":"ContainerDied","Data":"d9906d36a69122eaf6fc8dddd5e50b19060c07c9f48fbb47e04730965744e215"} Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.058421 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mr9dl"] Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.071704 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mr9dl"] Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.521277 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.588704 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-inventory\") pod \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.588888 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-ssh-key-openstack-edpm-ipam\") pod \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.588950 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2md4q\" (UniqueName: \"kubernetes.io/projected/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-kube-api-access-2md4q\") pod \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\" (UID: \"f04dc5ce-0657-4e8c-8c0a-3b86924ea903\") " Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.599232 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-kube-api-access-2md4q" (OuterVolumeSpecName: "kube-api-access-2md4q") pod "f04dc5ce-0657-4e8c-8c0a-3b86924ea903" (UID: "f04dc5ce-0657-4e8c-8c0a-3b86924ea903"). InnerVolumeSpecName "kube-api-access-2md4q". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.623859 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f04dc5ce-0657-4e8c-8c0a-3b86924ea903" (UID: "f04dc5ce-0657-4e8c-8c0a-3b86924ea903"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.634160 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-inventory" (OuterVolumeSpecName: "inventory") pod "f04dc5ce-0657-4e8c-8c0a-3b86924ea903" (UID: "f04dc5ce-0657-4e8c-8c0a-3b86924ea903"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.692229 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.692260 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2md4q\" (UniqueName: \"kubernetes.io/projected/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-kube-api-access-2md4q\") on node \"crc\" DevicePath \"\""
Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.692270 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04dc5ce-0657-4e8c-8c0a-3b86924ea903-inventory\") on node \"crc\" DevicePath \"\""
Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.910858 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7" event={"ID":"f04dc5ce-0657-4e8c-8c0a-3b86924ea903","Type":"ContainerDied","Data":"4b0845a4fa60ea88dbda152e24aa3c55f84bc3433c7219c9c728890cfdca0ac1"}
Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.910897 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b0845a4fa60ea88dbda152e24aa3c55f84bc3433c7219c9c728890cfdca0ac1"
Feb 18 14:31:23 crc kubenswrapper[4817]: I0218 14:31:23.910924 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vftp7"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.002497 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"]
Feb 18 14:31:24 crc kubenswrapper[4817]: E0218 14:31:24.002963 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96953ea-2054-4676-9ead-94870b6297d6" containerName="registry-server"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.002983 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96953ea-2054-4676-9ead-94870b6297d6" containerName="registry-server"
Feb 18 14:31:24 crc kubenswrapper[4817]: E0218 14:31:24.003034 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04dc5ce-0657-4e8c-8c0a-3b86924ea903" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.003042 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04dc5ce-0657-4e8c-8c0a-3b86924ea903" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:31:24 crc kubenswrapper[4817]: E0218 14:31:24.003074 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96953ea-2054-4676-9ead-94870b6297d6" containerName="extract-utilities"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.003081 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96953ea-2054-4676-9ead-94870b6297d6" containerName="extract-utilities"
Feb 18 14:31:24 crc kubenswrapper[4817]: E0218 14:31:24.003094 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96953ea-2054-4676-9ead-94870b6297d6" containerName="extract-content"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.003100 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96953ea-2054-4676-9ead-94870b6297d6" containerName="extract-content"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.003299 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d96953ea-2054-4676-9ead-94870b6297d6" containerName="registry-server"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.003320 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04dc5ce-0657-4e8c-8c0a-3b86924ea903" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.004158 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.009864 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.009936 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.010022 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.012704 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.015431 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"]
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.100197 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmjb\" (UniqueName: \"kubernetes.io/projected/30959336-e13c-426f-9116-3fd2e485a6ed-kube-api-access-6wmjb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.100287 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.100469 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.182046 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76e9a054-69ef-45a0-b901-7ba80c2c2f46" path="/var/lib/kubelet/pods/76e9a054-69ef-45a0-b901-7ba80c2c2f46/volumes"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.201982 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.202071 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wmjb\" (UniqueName: \"kubernetes.io/projected/30959336-e13c-426f-9116-3fd2e485a6ed-kube-api-access-6wmjb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.202131 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.206774 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.215510 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.221899 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wmjb\" (UniqueName: \"kubernetes.io/projected/30959336-e13c-426f-9116-3fd2e485a6ed-kube-api-access-6wmjb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.322565 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.857665 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"]
Feb 18 14:31:24 crc kubenswrapper[4817]: I0218 14:31:24.919604 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j" event={"ID":"30959336-e13c-426f-9116-3fd2e485a6ed","Type":"ContainerStarted","Data":"b7025f0d3eb9bb673fd86a905f63757fe847c7761be747bb7a646030da3831d1"}
Feb 18 14:31:25 crc kubenswrapper[4817]: I0218 14:31:25.930943 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j" event={"ID":"30959336-e13c-426f-9116-3fd2e485a6ed","Type":"ContainerStarted","Data":"70d7e0b5a61821cb092c04702e20d874346b07ebdcd40856b12b8ee660e00e02"}
Feb 18 14:31:25 crc kubenswrapper[4817]: I0218 14:31:25.957770 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j" podStartSLOduration=2.551223154 podStartE2EDuration="2.957752039s" podCreationTimestamp="2026-02-18 14:31:23 +0000 UTC" firstStartedPulling="2026-02-18 14:31:24.865026423 +0000 UTC m=+1947.440562406" lastFinishedPulling="2026-02-18 14:31:25.271555288 +0000 UTC m=+1947.847091291" observedRunningTime="2026-02-18 14:31:25.947754626 +0000 UTC m=+1948.523290629" watchObservedRunningTime="2026-02-18 14:31:25.957752039 +0000 UTC m=+1948.533288022"
Feb 18 14:31:30 crc kubenswrapper[4817]: I0218 14:31:30.971863 4817 generic.go:334] "Generic (PLEG): container finished" podID="30959336-e13c-426f-9116-3fd2e485a6ed" containerID="70d7e0b5a61821cb092c04702e20d874346b07ebdcd40856b12b8ee660e00e02" exitCode=0
Feb 18 14:31:30 crc kubenswrapper[4817]: I0218 14:31:30.971993 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j" event={"ID":"30959336-e13c-426f-9116-3fd2e485a6ed","Type":"ContainerDied","Data":"70d7e0b5a61821cb092c04702e20d874346b07ebdcd40856b12b8ee660e00e02"}
Feb 18 14:31:31 crc kubenswrapper[4817]: I0218 14:31:31.172170 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"
Feb 18 14:31:31 crc kubenswrapper[4817]: E0218 14:31:31.172449 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.143756 4817 scope.go:117] "RemoveContainer" containerID="6068d26a7aff6d6af7977aa0851d1c0f08658d980729bbd3920d67f8b455ea51"
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.197685 4817 scope.go:117] "RemoveContainer" containerID="a70619c12ed1ff62eccf9a163deeae4ddb7b5cb7ff13b5191cb42343397ce290"
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.507980 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.667739 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wmjb\" (UniqueName: \"kubernetes.io/projected/30959336-e13c-426f-9116-3fd2e485a6ed-kube-api-access-6wmjb\") pod \"30959336-e13c-426f-9116-3fd2e485a6ed\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") "
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.668219 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-inventory\") pod \"30959336-e13c-426f-9116-3fd2e485a6ed\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") "
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.668309 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-ssh-key-openstack-edpm-ipam\") pod \"30959336-e13c-426f-9116-3fd2e485a6ed\" (UID: \"30959336-e13c-426f-9116-3fd2e485a6ed\") "
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.673628 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30959336-e13c-426f-9116-3fd2e485a6ed-kube-api-access-6wmjb" (OuterVolumeSpecName: "kube-api-access-6wmjb") pod "30959336-e13c-426f-9116-3fd2e485a6ed" (UID: "30959336-e13c-426f-9116-3fd2e485a6ed"). InnerVolumeSpecName "kube-api-access-6wmjb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.697008 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "30959336-e13c-426f-9116-3fd2e485a6ed" (UID: "30959336-e13c-426f-9116-3fd2e485a6ed"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.697744 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-inventory" (OuterVolumeSpecName: "inventory") pod "30959336-e13c-426f-9116-3fd2e485a6ed" (UID: "30959336-e13c-426f-9116-3fd2e485a6ed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.770645 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.770700 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wmjb\" (UniqueName: \"kubernetes.io/projected/30959336-e13c-426f-9116-3fd2e485a6ed-kube-api-access-6wmjb\") on node \"crc\" DevicePath \"\""
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.770714 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30959336-e13c-426f-9116-3fd2e485a6ed-inventory\") on node \"crc\" DevicePath \"\""
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.991320 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j" event={"ID":"30959336-e13c-426f-9116-3fd2e485a6ed","Type":"ContainerDied","Data":"b7025f0d3eb9bb673fd86a905f63757fe847c7761be747bb7a646030da3831d1"}
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.991357 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7025f0d3eb9bb673fd86a905f63757fe847c7761be747bb7a646030da3831d1"
Feb 18 14:31:32 crc kubenswrapper[4817]: I0218 14:31:32.991444 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.063574 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hptxc"]
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.073336 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hptxc"]
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.085658 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"]
Feb 18 14:31:33 crc kubenswrapper[4817]: E0218 14:31:33.086150 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30959336-e13c-426f-9116-3fd2e485a6ed" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.086174 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="30959336-e13c-426f-9116-3fd2e485a6ed" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.086433 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="30959336-e13c-426f-9116-3fd2e485a6ed" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.087195 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.096401 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"]
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.127456 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.128019 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.128259 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.128272 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.179329 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kttlh\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.179375 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kttlh\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.179519 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dbh2\" (UniqueName: \"kubernetes.io/projected/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-kube-api-access-5dbh2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kttlh\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.281496 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kttlh\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.281552 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kttlh\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.281736 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dbh2\" (UniqueName: \"kubernetes.io/projected/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-kube-api-access-5dbh2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kttlh\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.284758 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kttlh\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.286601 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kttlh\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.340763 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dbh2\" (UniqueName: \"kubernetes.io/projected/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-kube-api-access-5dbh2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kttlh\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:33 crc kubenswrapper[4817]: I0218 14:31:33.443426 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:31:34 crc kubenswrapper[4817]: I0218 14:31:34.013140 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"]
Feb 18 14:31:34 crc kubenswrapper[4817]: I0218 14:31:34.183610 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e176b326-3d3d-4b95-8a7e-e18448de49ae" path="/var/lib/kubelet/pods/e176b326-3d3d-4b95-8a7e-e18448de49ae/volumes"
Feb 18 14:31:35 crc kubenswrapper[4817]: I0218 14:31:35.014543 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh" event={"ID":"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3","Type":"ContainerStarted","Data":"2884fca895782d77b237345f38558101d223f74876d63e5c11118ae76c4c49b5"}
Feb 18 14:31:35 crc kubenswrapper[4817]: I0218 14:31:35.014827 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh" event={"ID":"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3","Type":"ContainerStarted","Data":"605feb146feacb6e6e8ebfc5b1e388bfbf2ac81c310880249b0408d0e3817a71"}
Feb 18 14:31:35 crc kubenswrapper[4817]: I0218 14:31:35.038781 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh" podStartSLOduration=1.588357009 podStartE2EDuration="2.038758304s" podCreationTimestamp="2026-02-18 14:31:33 +0000 UTC" firstStartedPulling="2026-02-18 14:31:34.015681481 +0000 UTC m=+1956.591217464" lastFinishedPulling="2026-02-18 14:31:34.466082776 +0000 UTC m=+1957.041618759" observedRunningTime="2026-02-18 14:31:35.031149172 +0000 UTC m=+1957.606685155" watchObservedRunningTime="2026-02-18 14:31:35.038758304 +0000 UTC m=+1957.614294287"
Feb 18 14:31:44 crc kubenswrapper[4817]: I0218 14:31:44.172097 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"
Feb 18 14:31:44 crc kubenswrapper[4817]: E0218 14:31:44.173309 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:31:55 crc kubenswrapper[4817]: I0218 14:31:55.171685 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"
Feb 18 14:31:55 crc kubenswrapper[4817]: E0218 14:31:55.172550 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.184586 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hsbc9"]
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.188344 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.197231 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hsbc9"]
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.286144 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4185e717-2ef8-456e-ad88-f8a65231cd06-utilities\") pod \"redhat-operators-hsbc9\" (UID: \"4185e717-2ef8-456e-ad88-f8a65231cd06\") " pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.286344 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c6gz\" (UniqueName: \"kubernetes.io/projected/4185e717-2ef8-456e-ad88-f8a65231cd06-kube-api-access-8c6gz\") pod \"redhat-operators-hsbc9\" (UID: \"4185e717-2ef8-456e-ad88-f8a65231cd06\") " pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.286680 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4185e717-2ef8-456e-ad88-f8a65231cd06-catalog-content\") pod \"redhat-operators-hsbc9\" (UID: \"4185e717-2ef8-456e-ad88-f8a65231cd06\") " pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.390133 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4185e717-2ef8-456e-ad88-f8a65231cd06-catalog-content\") pod \"redhat-operators-hsbc9\" (UID: \"4185e717-2ef8-456e-ad88-f8a65231cd06\") " pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.390332 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4185e717-2ef8-456e-ad88-f8a65231cd06-catalog-content\") pod \"redhat-operators-hsbc9\" (UID: \"4185e717-2ef8-456e-ad88-f8a65231cd06\") " pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.390422 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4185e717-2ef8-456e-ad88-f8a65231cd06-utilities\") pod \"redhat-operators-hsbc9\" (UID: \"4185e717-2ef8-456e-ad88-f8a65231cd06\") " pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.390472 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c6gz\" (UniqueName: \"kubernetes.io/projected/4185e717-2ef8-456e-ad88-f8a65231cd06-kube-api-access-8c6gz\") pod \"redhat-operators-hsbc9\" (UID: \"4185e717-2ef8-456e-ad88-f8a65231cd06\") " pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.391094 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4185e717-2ef8-456e-ad88-f8a65231cd06-utilities\") pod \"redhat-operators-hsbc9\" (UID: \"4185e717-2ef8-456e-ad88-f8a65231cd06\") " pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.410747 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c6gz\" (UniqueName: \"kubernetes.io/projected/4185e717-2ef8-456e-ad88-f8a65231cd06-kube-api-access-8c6gz\") pod \"redhat-operators-hsbc9\" (UID: \"4185e717-2ef8-456e-ad88-f8a65231cd06\") " pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:04 crc kubenswrapper[4817]: I0218 14:32:04.511522 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hsbc9"
Feb 18 14:32:05 crc kubenswrapper[4817]: I0218 14:32:05.035412 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hsbc9"]
Feb 18 14:32:05 crc kubenswrapper[4817]: I0218 14:32:05.747441 4817 generic.go:334] "Generic (PLEG): container finished" podID="4185e717-2ef8-456e-ad88-f8a65231cd06" containerID="2acbd7e6e91fbbd89a10a04a6472f01d4e0ecaadf2802edb6d53b6d51fa8c5e6" exitCode=0
Feb 18 14:32:05 crc kubenswrapper[4817]: I0218 14:32:05.747501 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsbc9" event={"ID":"4185e717-2ef8-456e-ad88-f8a65231cd06","Type":"ContainerDied","Data":"2acbd7e6e91fbbd89a10a04a6472f01d4e0ecaadf2802edb6d53b6d51fa8c5e6"}
Feb 18 14:32:05 crc kubenswrapper[4817]: I0218 14:32:05.747532 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsbc9" event={"ID":"4185e717-2ef8-456e-ad88-f8a65231cd06","Type":"ContainerStarted","Data":"1f0b14b5fe5be7988d9ff7a54584b3914ff94819d0fb37edc0c4173cb1e92ca9"}
Feb 18 14:32:08 crc kubenswrapper[4817]: I0218 14:32:08.039609 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-khvxr"]
Feb 18 14:32:08 crc kubenswrapper[4817]: I0218 14:32:08.050009 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-khvxr"]
Feb 18 14:32:08 crc kubenswrapper[4817]: I0218 14:32:08.182763 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858"
Feb 18 14:32:08 crc kubenswrapper[4817]: I0218 14:32:08.186102 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22047a76-beb3-439b-994a-25a8c306be7b" path="/var/lib/kubelet/pods/22047a76-beb3-439b-994a-25a8c306be7b/volumes"
Feb 18 14:32:08 crc kubenswrapper[4817]: E0218 14:32:08.186101 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:32:09 crc kubenswrapper[4817]: I0218 14:32:09.788254 4817 generic.go:334] "Generic (PLEG): container finished" podID="2e45ac1d-02e2-457d-9944-cf1ecaf8edd3" containerID="2884fca895782d77b237345f38558101d223f74876d63e5c11118ae76c4c49b5" exitCode=0
Feb 18 14:32:09 crc kubenswrapper[4817]: I0218 14:32:09.788360 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh" event={"ID":"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3","Type":"ContainerDied","Data":"2884fca895782d77b237345f38558101d223f74876d63e5c11118ae76c4c49b5"}
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.023747 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.118549 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dbh2\" (UniqueName: \"kubernetes.io/projected/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-kube-api-access-5dbh2\") pod \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") "
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.118760 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-inventory\") pod \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") "
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.118899 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-ssh-key-openstack-edpm-ipam\") pod \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\" (UID: \"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3\") "
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.126338 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-kube-api-access-5dbh2" (OuterVolumeSpecName: "kube-api-access-5dbh2") pod "2e45ac1d-02e2-457d-9944-cf1ecaf8edd3" (UID: "2e45ac1d-02e2-457d-9944-cf1ecaf8edd3"). InnerVolumeSpecName "kube-api-access-5dbh2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.155775 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-inventory" (OuterVolumeSpecName: "inventory") pod "2e45ac1d-02e2-457d-9944-cf1ecaf8edd3" (UID: "2e45ac1d-02e2-457d-9944-cf1ecaf8edd3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.169875 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2e45ac1d-02e2-457d-9944-cf1ecaf8edd3" (UID: "2e45ac1d-02e2-457d-9944-cf1ecaf8edd3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.220822 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.220851 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dbh2\" (UniqueName: \"kubernetes.io/projected/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-kube-api-access-5dbh2\") on node \"crc\" DevicePath \"\""
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.220861 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e45ac1d-02e2-457d-9944-cf1ecaf8edd3-inventory\") on node \"crc\" DevicePath \"\""
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.842756 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsbc9" event={"ID":"4185e717-2ef8-456e-ad88-f8a65231cd06","Type":"ContainerStarted","Data":"2c66982bc45a3914c52ea65ccac772dea499ee8f6eb7147dadb684e91b0eaa55"}
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.844239 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh" event={"ID":"2e45ac1d-02e2-457d-9944-cf1ecaf8edd3","Type":"ContainerDied","Data":"605feb146feacb6e6e8ebfc5b1e388bfbf2ac81c310880249b0408d0e3817a71"}
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.844282 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="605feb146feacb6e6e8ebfc5b1e388bfbf2ac81c310880249b0408d0e3817a71"
Feb 18 14:32:14 crc kubenswrapper[4817]: I0218 14:32:14.844316 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kttlh"
Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.127560 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx"]
Feb 18 14:32:15 crc kubenswrapper[4817]: E0218 14:32:15.128030 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e45ac1d-02e2-457d-9944-cf1ecaf8edd3" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.128044 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e45ac1d-02e2-457d-9944-cf1ecaf8edd3" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.128251 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e45ac1d-02e2-457d-9944-cf1ecaf8edd3" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.129013 4817 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.132063 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.134931 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.135192 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.135352 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.205722 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx"] Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.240216 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sksvx\" (UniqueName: \"kubernetes.io/projected/e0745d01-0937-448d-a458-6f5823075a7a-kube-api-access-sksvx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-79xdx\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.240543 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-79xdx\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.240742 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-79xdx\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.342987 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sksvx\" (UniqueName: \"kubernetes.io/projected/e0745d01-0937-448d-a458-6f5823075a7a-kube-api-access-sksvx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-79xdx\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.343639 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-79xdx\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.343781 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-79xdx\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.348920 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-79xdx\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.359037 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-79xdx\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.359067 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sksvx\" (UniqueName: \"kubernetes.io/projected/e0745d01-0937-448d-a458-6f5823075a7a-kube-api-access-sksvx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-79xdx\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:15 crc kubenswrapper[4817]: I0218 14:32:15.445542 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:32:16 crc kubenswrapper[4817]: W0218 14:32:16.032606 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0745d01_0937_448d_a458_6f5823075a7a.slice/crio-5afbb74734de40c6679d0772681b670df1f54867c09784b1f1918941e8fed35b WatchSource:0}: Error finding container 5afbb74734de40c6679d0772681b670df1f54867c09784b1f1918941e8fed35b: Status 404 returned error can't find the container with id 5afbb74734de40c6679d0772681b670df1f54867c09784b1f1918941e8fed35b Feb 18 14:32:16 crc kubenswrapper[4817]: I0218 14:32:16.037994 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx"] Feb 18 14:32:16 crc kubenswrapper[4817]: I0218 14:32:16.863673 4817 generic.go:334] "Generic (PLEG): container finished" podID="4185e717-2ef8-456e-ad88-f8a65231cd06" containerID="2c66982bc45a3914c52ea65ccac772dea499ee8f6eb7147dadb684e91b0eaa55" exitCode=0 Feb 18 14:32:16 crc kubenswrapper[4817]: I0218 14:32:16.863713 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsbc9" event={"ID":"4185e717-2ef8-456e-ad88-f8a65231cd06","Type":"ContainerDied","Data":"2c66982bc45a3914c52ea65ccac772dea499ee8f6eb7147dadb684e91b0eaa55"} Feb 18 14:32:16 crc kubenswrapper[4817]: I0218 14:32:16.865992 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" event={"ID":"e0745d01-0937-448d-a458-6f5823075a7a","Type":"ContainerStarted","Data":"5afbb74734de40c6679d0772681b670df1f54867c09784b1f1918941e8fed35b"} Feb 18 14:32:19 crc kubenswrapper[4817]: I0218 14:32:19.909742 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" 
event={"ID":"e0745d01-0937-448d-a458-6f5823075a7a","Type":"ContainerStarted","Data":"4bdcfe21de5966ca57a8809879e24f4aa1b03d3be8d410f543c96150c3de259f"} Feb 18 14:32:19 crc kubenswrapper[4817]: I0218 14:32:19.936777 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" podStartSLOduration=2.175926695 podStartE2EDuration="4.936756822s" podCreationTimestamp="2026-02-18 14:32:15 +0000 UTC" firstStartedPulling="2026-02-18 14:32:16.035428182 +0000 UTC m=+1998.610964175" lastFinishedPulling="2026-02-18 14:32:18.796258319 +0000 UTC m=+2001.371794302" observedRunningTime="2026-02-18 14:32:19.930999906 +0000 UTC m=+2002.506535899" watchObservedRunningTime="2026-02-18 14:32:19.936756822 +0000 UTC m=+2002.512292805" Feb 18 14:32:20 crc kubenswrapper[4817]: I0218 14:32:20.172037 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:32:20 crc kubenswrapper[4817]: I0218 14:32:20.922761 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"350b94e016e63e9dbf9f1cb2943e19d533f92423c570ba0a133fb08ef9bb2a0b"} Feb 18 14:32:20 crc kubenswrapper[4817]: I0218 14:32:20.925504 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsbc9" event={"ID":"4185e717-2ef8-456e-ad88-f8a65231cd06","Type":"ContainerStarted","Data":"5d7971ce6030f8f4b5380f4f5cb5d796f422f752c56ec2d4f0f4320dfe792229"} Feb 18 14:32:24 crc kubenswrapper[4817]: I0218 14:32:24.512689 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hsbc9" Feb 18 14:32:24 crc kubenswrapper[4817]: I0218 14:32:24.513139 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-hsbc9" Feb 18 14:32:25 crc kubenswrapper[4817]: I0218 14:32:25.561395 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hsbc9" podUID="4185e717-2ef8-456e-ad88-f8a65231cd06" containerName="registry-server" probeResult="failure" output=< Feb 18 14:32:25 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 14:32:25 crc kubenswrapper[4817]: > Feb 18 14:32:32 crc kubenswrapper[4817]: I0218 14:32:32.392745 4817 scope.go:117] "RemoveContainer" containerID="47a50248f30af6ef7ab21313783ae7fc54be9ceb9b4f064a1b9e653f3b841298" Feb 18 14:32:32 crc kubenswrapper[4817]: I0218 14:32:32.442240 4817 scope.go:117] "RemoveContainer" containerID="522a5c4cd7a61ef4b17190b5ad9cbb8f6613603f9ad5fd28f81bab978cd24c1c" Feb 18 14:32:34 crc kubenswrapper[4817]: I0218 14:32:34.563669 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hsbc9" Feb 18 14:32:34 crc kubenswrapper[4817]: I0218 14:32:34.594173 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hsbc9" podStartSLOduration=16.677740856 podStartE2EDuration="30.594146934s" podCreationTimestamp="2026-02-18 14:32:04 +0000 UTC" firstStartedPulling="2026-02-18 14:32:05.750400116 +0000 UTC m=+1988.325936099" lastFinishedPulling="2026-02-18 14:32:19.666806194 +0000 UTC m=+2002.242342177" observedRunningTime="2026-02-18 14:32:20.991596009 +0000 UTC m=+2003.567132002" watchObservedRunningTime="2026-02-18 14:32:34.594146934 +0000 UTC m=+2017.169682917" Feb 18 14:32:34 crc kubenswrapper[4817]: I0218 14:32:34.612493 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hsbc9" Feb 18 14:32:35 crc kubenswrapper[4817]: I0218 14:32:35.203139 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-hsbc9"] Feb 18 14:32:35 crc kubenswrapper[4817]: I0218 14:32:35.372310 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rxkcv"] Feb 18 14:32:35 crc kubenswrapper[4817]: I0218 14:32:35.372615 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rxkcv" podUID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerName="registry-server" containerID="cri-o://542ea0e26151ce730342f18a3559c45637e430cf27d259610b4a3fe970e3a6f4" gracePeriod=2 Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.073022 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rxkcv_bc6b2223-1330-41d7-aad0-944699ee1f3c/registry-server/0.log" Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.075270 4817 generic.go:334] "Generic (PLEG): container finished" podID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerID="542ea0e26151ce730342f18a3559c45637e430cf27d259610b4a3fe970e3a6f4" exitCode=137 Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.075348 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxkcv" event={"ID":"bc6b2223-1330-41d7-aad0-944699ee1f3c","Type":"ContainerDied","Data":"542ea0e26151ce730342f18a3559c45637e430cf27d259610b4a3fe970e3a6f4"} Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.444021 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rxkcv_bc6b2223-1330-41d7-aad0-944699ee1f3c/registry-server/0.log" Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.444650 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rxkcv" Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.578849 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddv2x\" (UniqueName: \"kubernetes.io/projected/bc6b2223-1330-41d7-aad0-944699ee1f3c-kube-api-access-ddv2x\") pod \"bc6b2223-1330-41d7-aad0-944699ee1f3c\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.578999 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-catalog-content\") pod \"bc6b2223-1330-41d7-aad0-944699ee1f3c\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.579186 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-utilities\") pod \"bc6b2223-1330-41d7-aad0-944699ee1f3c\" (UID: \"bc6b2223-1330-41d7-aad0-944699ee1f3c\") " Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.581269 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-utilities" (OuterVolumeSpecName: "utilities") pod "bc6b2223-1330-41d7-aad0-944699ee1f3c" (UID: "bc6b2223-1330-41d7-aad0-944699ee1f3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.585465 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc6b2223-1330-41d7-aad0-944699ee1f3c-kube-api-access-ddv2x" (OuterVolumeSpecName: "kube-api-access-ddv2x") pod "bc6b2223-1330-41d7-aad0-944699ee1f3c" (UID: "bc6b2223-1330-41d7-aad0-944699ee1f3c"). InnerVolumeSpecName "kube-api-access-ddv2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.681970 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.682017 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddv2x\" (UniqueName: \"kubernetes.io/projected/bc6b2223-1330-41d7-aad0-944699ee1f3c-kube-api-access-ddv2x\") on node \"crc\" DevicePath \"\"" Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.750795 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc6b2223-1330-41d7-aad0-944699ee1f3c" (UID: "bc6b2223-1330-41d7-aad0-944699ee1f3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:32:38 crc kubenswrapper[4817]: I0218 14:32:38.784630 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6b2223-1330-41d7-aad0-944699ee1f3c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:32:39 crc kubenswrapper[4817]: I0218 14:32:39.085047 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rxkcv_bc6b2223-1330-41d7-aad0-944699ee1f3c/registry-server/0.log" Feb 18 14:32:39 crc kubenswrapper[4817]: I0218 14:32:39.086684 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxkcv" event={"ID":"bc6b2223-1330-41d7-aad0-944699ee1f3c","Type":"ContainerDied","Data":"e4dae77c358163b3026c66a6dc2b0f4c53da2bfc0abc9fb0290d936e903a79e9"} Feb 18 14:32:39 crc kubenswrapper[4817]: I0218 14:32:39.086745 4817 scope.go:117] "RemoveContainer" 
containerID="542ea0e26151ce730342f18a3559c45637e430cf27d259610b4a3fe970e3a6f4" Feb 18 14:32:39 crc kubenswrapper[4817]: I0218 14:32:39.086796 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rxkcv" Feb 18 14:32:39 crc kubenswrapper[4817]: I0218 14:32:39.113775 4817 scope.go:117] "RemoveContainer" containerID="3382e2d1e8d128a8cf4b8c0b3ef168a0e8e8af682a1885736930424f5b0456cf" Feb 18 14:32:39 crc kubenswrapper[4817]: I0218 14:32:39.117719 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rxkcv"] Feb 18 14:32:39 crc kubenswrapper[4817]: I0218 14:32:39.133433 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rxkcv"] Feb 18 14:32:39 crc kubenswrapper[4817]: I0218 14:32:39.164954 4817 scope.go:117] "RemoveContainer" containerID="39d705451ea265ef9bf5db17576e31f23365fbd7341d6fa8b277d8553895b18c" Feb 18 14:32:40 crc kubenswrapper[4817]: I0218 14:32:40.183888 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc6b2223-1330-41d7-aad0-944699ee1f3c" path="/var/lib/kubelet/pods/bc6b2223-1330-41d7-aad0-944699ee1f3c/volumes" Feb 18 14:33:04 crc kubenswrapper[4817]: I0218 14:33:04.331188 4817 generic.go:334] "Generic (PLEG): container finished" podID="e0745d01-0937-448d-a458-6f5823075a7a" containerID="4bdcfe21de5966ca57a8809879e24f4aa1b03d3be8d410f543c96150c3de259f" exitCode=0 Feb 18 14:33:04 crc kubenswrapper[4817]: I0218 14:33:04.331268 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" event={"ID":"e0745d01-0937-448d-a458-6f5823075a7a","Type":"ContainerDied","Data":"4bdcfe21de5966ca57a8809879e24f4aa1b03d3be8d410f543c96150c3de259f"} Feb 18 14:33:05 crc kubenswrapper[4817]: I0218 14:33:05.972294 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.033315 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sksvx\" (UniqueName: \"kubernetes.io/projected/e0745d01-0937-448d-a458-6f5823075a7a-kube-api-access-sksvx\") pod \"e0745d01-0937-448d-a458-6f5823075a7a\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.033548 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-inventory\") pod \"e0745d01-0937-448d-a458-6f5823075a7a\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.033674 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-ssh-key-openstack-edpm-ipam\") pod \"e0745d01-0937-448d-a458-6f5823075a7a\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.038898 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0745d01-0937-448d-a458-6f5823075a7a-kube-api-access-sksvx" (OuterVolumeSpecName: "kube-api-access-sksvx") pod "e0745d01-0937-448d-a458-6f5823075a7a" (UID: "e0745d01-0937-448d-a458-6f5823075a7a"). InnerVolumeSpecName "kube-api-access-sksvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:33:06 crc kubenswrapper[4817]: E0218 14:33:06.062085 4817 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-inventory podName:e0745d01-0937-448d-a458-6f5823075a7a nodeName:}" failed. 
No retries permitted until 2026-02-18 14:33:06.562055095 +0000 UTC m=+2049.137591078 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-inventory") pod "e0745d01-0937-448d-a458-6f5823075a7a" (UID: "e0745d01-0937-448d-a458-6f5823075a7a") : error deleting /var/lib/kubelet/pods/e0745d01-0937-448d-a458-6f5823075a7a/volume-subpaths: remove /var/lib/kubelet/pods/e0745d01-0937-448d-a458-6f5823075a7a/volume-subpaths: no such file or directory Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.065035 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e0745d01-0937-448d-a458-6f5823075a7a" (UID: "e0745d01-0937-448d-a458-6f5823075a7a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.136015 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sksvx\" (UniqueName: \"kubernetes.io/projected/e0745d01-0937-448d-a458-6f5823075a7a-kube-api-access-sksvx\") on node \"crc\" DevicePath \"\"" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.136054 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.352122 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" event={"ID":"e0745d01-0937-448d-a458-6f5823075a7a","Type":"ContainerDied","Data":"5afbb74734de40c6679d0772681b670df1f54867c09784b1f1918941e8fed35b"} Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.352177 
4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-79xdx" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.352183 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5afbb74734de40c6679d0772681b670df1f54867c09784b1f1918941e8fed35b" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.448389 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t7ncv"] Feb 18 14:33:06 crc kubenswrapper[4817]: E0218 14:33:06.448858 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerName="registry-server" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.448879 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerName="registry-server" Feb 18 14:33:06 crc kubenswrapper[4817]: E0218 14:33:06.448896 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerName="extract-content" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.448904 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerName="extract-content" Feb 18 14:33:06 crc kubenswrapper[4817]: E0218 14:33:06.448937 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerName="extract-utilities" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.448967 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerName="extract-utilities" Feb 18 14:33:06 crc kubenswrapper[4817]: E0218 14:33:06.450310 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0745d01-0937-448d-a458-6f5823075a7a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 14:33:06 crc 
kubenswrapper[4817]: I0218 14:33:06.450336 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0745d01-0937-448d-a458-6f5823075a7a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.450681 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc6b2223-1330-41d7-aad0-944699ee1f3c" containerName="registry-server" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.450715 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0745d01-0937-448d-a458-6f5823075a7a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.451695 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.458714 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t7ncv"] Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.543811 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f75wp\" (UniqueName: \"kubernetes.io/projected/1bef22ac-a84d-4941-8290-6b98eb56367b-kube-api-access-f75wp\") pod \"ssh-known-hosts-edpm-deployment-t7ncv\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.543887 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t7ncv\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.543988 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t7ncv\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.645874 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-inventory\") pod \"e0745d01-0937-448d-a458-6f5823075a7a\" (UID: \"e0745d01-0937-448d-a458-6f5823075a7a\") " Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.646680 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f75wp\" (UniqueName: \"kubernetes.io/projected/1bef22ac-a84d-4941-8290-6b98eb56367b-kube-api-access-f75wp\") pod \"ssh-known-hosts-edpm-deployment-t7ncv\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.646757 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t7ncv\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.646862 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t7ncv\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 
14:33:06.650532 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-t7ncv\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.662264 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-inventory" (OuterVolumeSpecName: "inventory") pod "e0745d01-0937-448d-a458-6f5823075a7a" (UID: "e0745d01-0937-448d-a458-6f5823075a7a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.665554 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-t7ncv\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.668034 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f75wp\" (UniqueName: \"kubernetes.io/projected/1bef22ac-a84d-4941-8290-6b98eb56367b-kube-api-access-f75wp\") pod \"ssh-known-hosts-edpm-deployment-t7ncv\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.748894 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0745d01-0937-448d-a458-6f5823075a7a-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:33:06 crc kubenswrapper[4817]: I0218 14:33:06.767941 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:07 crc kubenswrapper[4817]: I0218 14:33:07.293952 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-t7ncv"] Feb 18 14:33:07 crc kubenswrapper[4817]: I0218 14:33:07.363876 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" event={"ID":"1bef22ac-a84d-4941-8290-6b98eb56367b","Type":"ContainerStarted","Data":"2bd31f3da96bb6c67c9b23003b10d3d98abf8c391b7212fe11c2aeae5608e40f"} Feb 18 14:33:08 crc kubenswrapper[4817]: I0218 14:33:08.374113 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" event={"ID":"1bef22ac-a84d-4941-8290-6b98eb56367b","Type":"ContainerStarted","Data":"6e7687286fa87a428e29c7f312a56b4043e8b24174c68109c84647114dc905ca"} Feb 18 14:33:08 crc kubenswrapper[4817]: I0218 14:33:08.395623 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" podStartSLOduration=1.9187074769999999 podStartE2EDuration="2.395601412s" podCreationTimestamp="2026-02-18 14:33:06 +0000 UTC" firstStartedPulling="2026-02-18 14:33:07.305039621 +0000 UTC m=+2049.880575604" lastFinishedPulling="2026-02-18 14:33:07.781933556 +0000 UTC m=+2050.357469539" observedRunningTime="2026-02-18 14:33:08.389853026 +0000 UTC m=+2050.965389009" watchObservedRunningTime="2026-02-18 14:33:08.395601412 +0000 UTC m=+2050.971137395" Feb 18 14:33:15 crc kubenswrapper[4817]: I0218 14:33:15.040700 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-6mxmm"] Feb 18 14:33:15 crc kubenswrapper[4817]: I0218 14:33:15.050726 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-6mxmm"] Feb 18 14:33:15 crc kubenswrapper[4817]: I0218 14:33:15.435208 4817 generic.go:334] "Generic (PLEG): container finished" 
podID="1bef22ac-a84d-4941-8290-6b98eb56367b" containerID="6e7687286fa87a428e29c7f312a56b4043e8b24174c68109c84647114dc905ca" exitCode=0 Feb 18 14:33:15 crc kubenswrapper[4817]: I0218 14:33:15.435294 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" event={"ID":"1bef22ac-a84d-4941-8290-6b98eb56367b","Type":"ContainerDied","Data":"6e7687286fa87a428e29c7f312a56b4043e8b24174c68109c84647114dc905ca"} Feb 18 14:33:16 crc kubenswrapper[4817]: I0218 14:33:16.183847 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4e668a-0bfc-430d-8796-9ed775e01913" path="/var/lib/kubelet/pods/2e4e668a-0bfc-430d-8796-9ed775e01913/volumes" Feb 18 14:33:16 crc kubenswrapper[4817]: I0218 14:33:16.971990 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.076302 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-ssh-key-openstack-edpm-ipam\") pod \"1bef22ac-a84d-4941-8290-6b98eb56367b\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.076414 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-inventory-0\") pod \"1bef22ac-a84d-4941-8290-6b98eb56367b\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.076518 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f75wp\" (UniqueName: \"kubernetes.io/projected/1bef22ac-a84d-4941-8290-6b98eb56367b-kube-api-access-f75wp\") pod \"1bef22ac-a84d-4941-8290-6b98eb56367b\" (UID: \"1bef22ac-a84d-4941-8290-6b98eb56367b\") " Feb 
18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.083230 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bef22ac-a84d-4941-8290-6b98eb56367b-kube-api-access-f75wp" (OuterVolumeSpecName: "kube-api-access-f75wp") pod "1bef22ac-a84d-4941-8290-6b98eb56367b" (UID: "1bef22ac-a84d-4941-8290-6b98eb56367b"). InnerVolumeSpecName "kube-api-access-f75wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.108916 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1bef22ac-a84d-4941-8290-6b98eb56367b" (UID: "1bef22ac-a84d-4941-8290-6b98eb56367b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.110175 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "1bef22ac-a84d-4941-8290-6b98eb56367b" (UID: "1bef22ac-a84d-4941-8290-6b98eb56367b"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.179443 4817 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.179479 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f75wp\" (UniqueName: \"kubernetes.io/projected/1bef22ac-a84d-4941-8290-6b98eb56367b-kube-api-access-f75wp\") on node \"crc\" DevicePath \"\"" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.179493 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bef22ac-a84d-4941-8290-6b98eb56367b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.451869 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" event={"ID":"1bef22ac-a84d-4941-8290-6b98eb56367b","Type":"ContainerDied","Data":"2bd31f3da96bb6c67c9b23003b10d3d98abf8c391b7212fe11c2aeae5608e40f"} Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.451910 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bd31f3da96bb6c67c9b23003b10d3d98abf8c391b7212fe11c2aeae5608e40f" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.451961 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-t7ncv" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.518671 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n"] Feb 18 14:33:17 crc kubenswrapper[4817]: E0218 14:33:17.529749 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bef22ac-a84d-4941-8290-6b98eb56367b" containerName="ssh-known-hosts-edpm-deployment" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.529788 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bef22ac-a84d-4941-8290-6b98eb56367b" containerName="ssh-known-hosts-edpm-deployment" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.530520 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bef22ac-a84d-4941-8290-6b98eb56367b" containerName="ssh-known-hosts-edpm-deployment" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.531689 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.534563 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.534767 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.541146 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.541359 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.575115 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n"] Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.587007 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vxt6n\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.587110 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vxt6n\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.587249 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tllm2\" (UniqueName: \"kubernetes.io/projected/abdb2358-3c43-4027-ab8e-fb25932c4f97-kube-api-access-tllm2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vxt6n\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.690531 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tllm2\" (UniqueName: \"kubernetes.io/projected/abdb2358-3c43-4027-ab8e-fb25932c4f97-kube-api-access-tllm2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vxt6n\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.691192 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vxt6n\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.691503 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vxt6n\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.704387 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-vxt6n\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.705544 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vxt6n\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.706231 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tllm2\" (UniqueName: \"kubernetes.io/projected/abdb2358-3c43-4027-ab8e-fb25932c4f97-kube-api-access-tllm2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vxt6n\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:17 crc kubenswrapper[4817]: I0218 14:33:17.850538 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:18 crc kubenswrapper[4817]: W0218 14:33:18.346315 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabdb2358_3c43_4027_ab8e_fb25932c4f97.slice/crio-95527e7dec0e35565d03b723c13049d684359fd86b35e03e08cd454f2e318951 WatchSource:0}: Error finding container 95527e7dec0e35565d03b723c13049d684359fd86b35e03e08cd454f2e318951: Status 404 returned error can't find the container with id 95527e7dec0e35565d03b723c13049d684359fd86b35e03e08cd454f2e318951 Feb 18 14:33:18 crc kubenswrapper[4817]: I0218 14:33:18.349444 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n"] Feb 18 14:33:18 crc kubenswrapper[4817]: I0218 14:33:18.465035 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" event={"ID":"abdb2358-3c43-4027-ab8e-fb25932c4f97","Type":"ContainerStarted","Data":"95527e7dec0e35565d03b723c13049d684359fd86b35e03e08cd454f2e318951"} Feb 18 14:33:19 crc kubenswrapper[4817]: I0218 14:33:19.479936 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" event={"ID":"abdb2358-3c43-4027-ab8e-fb25932c4f97","Type":"ContainerStarted","Data":"cdb2987ad86405bfbe0ede0646b33f95a014a01fc57c609289b5a060003a3233"} Feb 18 14:33:19 crc kubenswrapper[4817]: I0218 14:33:19.503877 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" podStartSLOduration=2.072912701 podStartE2EDuration="2.503861393s" podCreationTimestamp="2026-02-18 14:33:17 +0000 UTC" firstStartedPulling="2026-02-18 14:33:18.348969375 +0000 UTC m=+2060.924505348" lastFinishedPulling="2026-02-18 14:33:18.779918057 +0000 UTC m=+2061.355454040" observedRunningTime="2026-02-18 
14:33:19.49860125 +0000 UTC m=+2062.074137233" watchObservedRunningTime="2026-02-18 14:33:19.503861393 +0000 UTC m=+2062.079397376" Feb 18 14:33:23 crc kubenswrapper[4817]: I0218 14:33:23.032992 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-rc7gd"] Feb 18 14:33:23 crc kubenswrapper[4817]: I0218 14:33:23.041042 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-rc7gd"] Feb 18 14:33:24 crc kubenswrapper[4817]: I0218 14:33:24.183390 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eec2d51-e854-4ecd-aa31-fd94387e6aa6" path="/var/lib/kubelet/pods/2eec2d51-e854-4ecd-aa31-fd94387e6aa6/volumes" Feb 18 14:33:26 crc kubenswrapper[4817]: I0218 14:33:26.545949 4817 generic.go:334] "Generic (PLEG): container finished" podID="abdb2358-3c43-4027-ab8e-fb25932c4f97" containerID="cdb2987ad86405bfbe0ede0646b33f95a014a01fc57c609289b5a060003a3233" exitCode=0 Feb 18 14:33:26 crc kubenswrapper[4817]: I0218 14:33:26.546045 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" event={"ID":"abdb2358-3c43-4027-ab8e-fb25932c4f97","Type":"ContainerDied","Data":"cdb2987ad86405bfbe0ede0646b33f95a014a01fc57c609289b5a060003a3233"} Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.056500 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.117725 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-inventory\") pod \"abdb2358-3c43-4027-ab8e-fb25932c4f97\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.117854 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-ssh-key-openstack-edpm-ipam\") pod \"abdb2358-3c43-4027-ab8e-fb25932c4f97\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.117947 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tllm2\" (UniqueName: \"kubernetes.io/projected/abdb2358-3c43-4027-ab8e-fb25932c4f97-kube-api-access-tllm2\") pod \"abdb2358-3c43-4027-ab8e-fb25932c4f97\" (UID: \"abdb2358-3c43-4027-ab8e-fb25932c4f97\") " Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.122994 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abdb2358-3c43-4027-ab8e-fb25932c4f97-kube-api-access-tllm2" (OuterVolumeSpecName: "kube-api-access-tllm2") pod "abdb2358-3c43-4027-ab8e-fb25932c4f97" (UID: "abdb2358-3c43-4027-ab8e-fb25932c4f97"). InnerVolumeSpecName "kube-api-access-tllm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.144917 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-inventory" (OuterVolumeSpecName: "inventory") pod "abdb2358-3c43-4027-ab8e-fb25932c4f97" (UID: "abdb2358-3c43-4027-ab8e-fb25932c4f97"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.146025 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "abdb2358-3c43-4027-ab8e-fb25932c4f97" (UID: "abdb2358-3c43-4027-ab8e-fb25932c4f97"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.221154 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.221188 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abdb2358-3c43-4027-ab8e-fb25932c4f97-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.221200 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tllm2\" (UniqueName: \"kubernetes.io/projected/abdb2358-3c43-4027-ab8e-fb25932c4f97-kube-api-access-tllm2\") on node \"crc\" DevicePath \"\"" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.569627 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" event={"ID":"abdb2358-3c43-4027-ab8e-fb25932c4f97","Type":"ContainerDied","Data":"95527e7dec0e35565d03b723c13049d684359fd86b35e03e08cd454f2e318951"} Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.569668 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95527e7dec0e35565d03b723c13049d684359fd86b35e03e08cd454f2e318951" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 
14:33:28.569700 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vxt6n" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.698158 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj"] Feb 18 14:33:28 crc kubenswrapper[4817]: E0218 14:33:28.698936 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdb2358-3c43-4027-ab8e-fb25932c4f97" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.698961 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdb2358-3c43-4027-ab8e-fb25932c4f97" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.699224 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdb2358-3c43-4027-ab8e-fb25932c4f97" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.699994 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.701949 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.702075 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.704266 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.704371 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.710471 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj"] Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.748549 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k89g5\" (UniqueName: \"kubernetes.io/projected/3c302fa9-5186-4192-9cf6-b6d533570323-kube-api-access-k89g5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.748613 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.748814 4817 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.851070 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.851144 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k89g5\" (UniqueName: \"kubernetes.io/projected/3c302fa9-5186-4192-9cf6-b6d533570323-kube-api-access-k89g5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.851166 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.855386 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.855515 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:28 crc kubenswrapper[4817]: I0218 14:33:28.868195 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k89g5\" (UniqueName: \"kubernetes.io/projected/3c302fa9-5186-4192-9cf6-b6d533570323-kube-api-access-k89g5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:29 crc kubenswrapper[4817]: I0218 14:33:29.059754 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" Feb 18 14:33:29 crc kubenswrapper[4817]: I0218 14:33:29.594445 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj"] Feb 18 14:33:30 crc kubenswrapper[4817]: I0218 14:33:30.587226 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" event={"ID":"3c302fa9-5186-4192-9cf6-b6d533570323","Type":"ContainerStarted","Data":"53039725c9f5782e5d44df30fdfb64f9ffaea2c8a02e8c59b9def59cf0da68b0"} Feb 18 14:33:32 crc kubenswrapper[4817]: I0218 14:33:32.606138 4817 scope.go:117] "RemoveContainer" containerID="48eec7203a92500b53a432814248b1ba6b6019c370a3555217314aa7397779b1" Feb 18 14:33:32 crc kubenswrapper[4817]: I0218 14:33:32.643996 4817 scope.go:117] "RemoveContainer" containerID="4ba7a12f52d396a8ef524aad1ccad889df65d9d3b445891e515c2ba8acfa3569" Feb 18 14:33:35 crc kubenswrapper[4817]: I0218 14:33:35.632789 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" event={"ID":"3c302fa9-5186-4192-9cf6-b6d533570323","Type":"ContainerStarted","Data":"4d0341b8d557d948a783efc7f0bb288a57c0174249a1b3f4825eb2bc5309ed90"} Feb 18 14:33:35 crc kubenswrapper[4817]: I0218 14:33:35.656472 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" podStartSLOduration=2.556608528 podStartE2EDuration="7.656443211s" podCreationTimestamp="2026-02-18 14:33:28 +0000 UTC" firstStartedPulling="2026-02-18 14:33:29.596887899 +0000 UTC m=+2072.172423882" lastFinishedPulling="2026-02-18 14:33:34.696722582 +0000 UTC m=+2077.272258565" observedRunningTime="2026-02-18 14:33:35.648269774 +0000 UTC m=+2078.223805757" watchObservedRunningTime="2026-02-18 14:33:35.656443211 +0000 UTC m=+2078.231979204" Feb 18 14:33:44 crc 
kubenswrapper[4817]: I0218 14:33:44.709365 4817 generic.go:334] "Generic (PLEG): container finished" podID="3c302fa9-5186-4192-9cf6-b6d533570323" containerID="4d0341b8d557d948a783efc7f0bb288a57c0174249a1b3f4825eb2bc5309ed90" exitCode=0
Feb 18 14:33:44 crc kubenswrapper[4817]: I0218 14:33:44.709467 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" event={"ID":"3c302fa9-5186-4192-9cf6-b6d533570323","Type":"ContainerDied","Data":"4d0341b8d557d948a783efc7f0bb288a57c0174249a1b3f4825eb2bc5309ed90"}
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.210243 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.319869 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-inventory\") pod \"3c302fa9-5186-4192-9cf6-b6d533570323\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") "
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.320000 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-ssh-key-openstack-edpm-ipam\") pod \"3c302fa9-5186-4192-9cf6-b6d533570323\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") "
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.320184 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k89g5\" (UniqueName: \"kubernetes.io/projected/3c302fa9-5186-4192-9cf6-b6d533570323-kube-api-access-k89g5\") pod \"3c302fa9-5186-4192-9cf6-b6d533570323\" (UID: \"3c302fa9-5186-4192-9cf6-b6d533570323\") "
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.334551 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c302fa9-5186-4192-9cf6-b6d533570323-kube-api-access-k89g5" (OuterVolumeSpecName: "kube-api-access-k89g5") pod "3c302fa9-5186-4192-9cf6-b6d533570323" (UID: "3c302fa9-5186-4192-9cf6-b6d533570323"). InnerVolumeSpecName "kube-api-access-k89g5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.348601 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-inventory" (OuterVolumeSpecName: "inventory") pod "3c302fa9-5186-4192-9cf6-b6d533570323" (UID: "3c302fa9-5186-4192-9cf6-b6d533570323"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.354323 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3c302fa9-5186-4192-9cf6-b6d533570323" (UID: "3c302fa9-5186-4192-9cf6-b6d533570323"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.422970 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k89g5\" (UniqueName: \"kubernetes.io/projected/3c302fa9-5186-4192-9cf6-b6d533570323-kube-api-access-k89g5\") on node \"crc\" DevicePath \"\""
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.423047 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-inventory\") on node \"crc\" DevicePath \"\""
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.423064 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c302fa9-5186-4192-9cf6-b6d533570323-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.733442 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj" event={"ID":"3c302fa9-5186-4192-9cf6-b6d533570323","Type":"ContainerDied","Data":"53039725c9f5782e5d44df30fdfb64f9ffaea2c8a02e8c59b9def59cf0da68b0"}
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.733487 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53039725c9f5782e5d44df30fdfb64f9ffaea2c8a02e8c59b9def59cf0da68b0"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.733517 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.841949 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"]
Feb 18 14:33:46 crc kubenswrapper[4817]: E0218 14:33:46.842956 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c302fa9-5186-4192-9cf6-b6d533570323" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.843030 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c302fa9-5186-4192-9cf6-b6d533570323" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.843394 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c302fa9-5186-4192-9cf6-b6d533570323" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.845053 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.847719 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.847960 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.848148 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.848386 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.848567 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.848786 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.851898 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.854225 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 18 14:33:46 crc kubenswrapper[4817]: I0218 14:33:46.854641 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"]
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.033927 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034033 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034068 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcm9x\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-kube-api-access-dcm9x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034101 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034158 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034279 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034344 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034396 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034460 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034493 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034529 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034559 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034751 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.034875 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137187 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137245 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137283 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137321 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137357 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137396 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137467 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137520 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137635 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137673 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137695 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcm9x\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-kube-api-access-dcm9x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137719 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137743 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.137805 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.143349 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.144828 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.145060 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.145349 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.145918 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.145957 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.146021 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.146528 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.147829 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.148216 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.148921 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.149419 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.151926 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.156667 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcm9x\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-kube-api-access-dcm9x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.176659 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.731308 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"]
Feb 18 14:33:47 crc kubenswrapper[4817]: I0218 14:33:47.746007 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg" event={"ID":"f54e3715-121a-4498-a552-5a9f1daed55c","Type":"ContainerStarted","Data":"b7093c3add64537f5583a66c33130d5eaeb268d210b231c99a1f406e88a84aaa"}
Feb 18 14:33:48 crc kubenswrapper[4817]: I0218 14:33:48.755733 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg" event={"ID":"f54e3715-121a-4498-a552-5a9f1daed55c","Type":"ContainerStarted","Data":"45bc11e78e5fafb51bcd98212248904d7fb85145d0949bd6780f1d67b21c56da"}
Feb 18 14:33:48 crc kubenswrapper[4817]: I0218 14:33:48.786503 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg" podStartSLOduration=2.201522543 podStartE2EDuration="2.786483302s" podCreationTimestamp="2026-02-18 14:33:46 +0000 UTC" firstStartedPulling="2026-02-18 14:33:47.735231486 +0000 UTC m=+2090.310767469" lastFinishedPulling="2026-02-18 14:33:48.320192235 +0000 UTC m=+2090.895728228" observedRunningTime="2026-02-18 14:33:48.781069045 +0000 UTC m=+2091.356605038" watchObservedRunningTime="2026-02-18 14:33:48.786483302 +0000 UTC m=+2091.362019295"
Feb 18 14:34:23 crc kubenswrapper[4817]: I0218 14:34:23.067250 4817 generic.go:334] "Generic (PLEG): container finished" podID="f54e3715-121a-4498-a552-5a9f1daed55c" containerID="45bc11e78e5fafb51bcd98212248904d7fb85145d0949bd6780f1d67b21c56da" exitCode=0
Feb 18 14:34:23 crc kubenswrapper[4817]: I0218 14:34:23.067336 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg" event={"ID":"f54e3715-121a-4498-a552-5a9f1daed55c","Type":"ContainerDied","Data":"45bc11e78e5fafb51bcd98212248904d7fb85145d0949bd6780f1d67b21c56da"}
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.587052 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg"
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642176 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642236 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcm9x\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-kube-api-access-dcm9x\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642278 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-bootstrap-combined-ca-bundle\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642341 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-neutron-metadata-combined-ca-bundle\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642415 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ovn-combined-ca-bundle\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642450 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-telemetry-combined-ca-bundle\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642487 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ssh-key-openstack-edpm-ipam\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642540 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-repo-setup-combined-ca-bundle\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642580 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642601 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-libvirt-combined-ca-bundle\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642638 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-nova-combined-ca-bundle\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642665 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-inventory\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642710 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.642734 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"f54e3715-121a-4498-a552-5a9f1daed55c\" (UID: \"f54e3715-121a-4498-a552-5a9f1daed55c\") "
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.654030 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.654839 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-kube-api-access-dcm9x" (OuterVolumeSpecName: "kube-api-access-dcm9x") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "kube-api-access-dcm9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.655555 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.655971 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.656038 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.656059 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.656143 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.656154 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.656146 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.656213 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.656548 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.659602 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.681628 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.684786 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-inventory" (OuterVolumeSpecName: "inventory") pod "f54e3715-121a-4498-a552-5a9f1daed55c" (UID: "f54e3715-121a-4498-a552-5a9f1daed55c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745377 4817 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745416 4817 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745429 4817 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745440 4817 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745450 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745458 4817 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745468 4817 reconciler_common.go:293] "Volume detached for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745483 4817 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745503 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcm9x\" (UniqueName: \"kubernetes.io/projected/f54e3715-121a-4498-a552-5a9f1daed55c-kube-api-access-dcm9x\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745544 4817 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745557 4817 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745570 4817 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745581 4817 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 
18 14:34:24 crc kubenswrapper[4817]: I0218 14:34:24.745589 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f54e3715-121a-4498-a552-5a9f1daed55c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.086881 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg" event={"ID":"f54e3715-121a-4498-a552-5a9f1daed55c","Type":"ContainerDied","Data":"b7093c3add64537f5583a66c33130d5eaeb268d210b231c99a1f406e88a84aaa"} Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.086923 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7093c3add64537f5583a66c33130d5eaeb268d210b231c99a1f406e88a84aaa" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.086948 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.197003 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf"] Feb 18 14:34:25 crc kubenswrapper[4817]: E0218 14:34:25.197411 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54e3715-121a-4498-a552-5a9f1daed55c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.197429 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54e3715-121a-4498-a552-5a9f1daed55c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.197622 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54e3715-121a-4498-a552-5a9f1daed55c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 
14:34:25.198424 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.203872 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.204052 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.204230 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.204553 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.204675 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.216313 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf"] Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.354559 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.354798 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znfrc\" (UniqueName: \"kubernetes.io/projected/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-kube-api-access-znfrc\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.354847 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.354884 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.354915 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.456862 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 
14:34:25.457005 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znfrc\" (UniqueName: \"kubernetes.io/projected/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-kube-api-access-znfrc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.457040 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.457067 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.457090 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.458052 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: 
\"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.460808 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.467470 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.468193 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.473436 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znfrc\" (UniqueName: \"kubernetes.io/projected/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-kube-api-access-znfrc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6twbf\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:25 crc kubenswrapper[4817]: I0218 14:34:25.532330 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:34:26 crc kubenswrapper[4817]: I0218 14:34:26.045484 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf"] Feb 18 14:34:26 crc kubenswrapper[4817]: I0218 14:34:26.098285 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" event={"ID":"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272","Type":"ContainerStarted","Data":"2c3532432fd2eb750e5741b91f880f4124e5bd56a78b54144268d691de246718"} Feb 18 14:34:27 crc kubenswrapper[4817]: I0218 14:34:27.117562 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" event={"ID":"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272","Type":"ContainerStarted","Data":"9813f58ea8ebfe5e88077b97ac607c300769d25b43d1163ce4f066f6db9d706f"} Feb 18 14:34:27 crc kubenswrapper[4817]: I0218 14:34:27.140874 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" podStartSLOduration=1.762472499 podStartE2EDuration="2.140856482s" podCreationTimestamp="2026-02-18 14:34:25 +0000 UTC" firstStartedPulling="2026-02-18 14:34:26.047539102 +0000 UTC m=+2128.623075085" lastFinishedPulling="2026-02-18 14:34:26.425923085 +0000 UTC m=+2129.001459068" observedRunningTime="2026-02-18 14:34:27.131942897 +0000 UTC m=+2129.707478880" watchObservedRunningTime="2026-02-18 14:34:27.140856482 +0000 UTC m=+2129.716392465" Feb 18 14:34:42 crc kubenswrapper[4817]: I0218 14:34:42.863872 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:34:42 crc kubenswrapper[4817]: I0218 14:34:42.864592 4817 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:35:12 crc kubenswrapper[4817]: I0218 14:35:12.863902 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:35:12 crc kubenswrapper[4817]: I0218 14:35:12.864436 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:35:23 crc kubenswrapper[4817]: I0218 14:35:23.686097 4817 generic.go:334] "Generic (PLEG): container finished" podID="ae44a6c3-1d36-4f95-b52a-a1bedc6ec272" containerID="9813f58ea8ebfe5e88077b97ac607c300769d25b43d1163ce4f066f6db9d706f" exitCode=0 Feb 18 14:35:23 crc kubenswrapper[4817]: I0218 14:35:23.686168 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" event={"ID":"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272","Type":"ContainerDied","Data":"9813f58ea8ebfe5e88077b97ac607c300769d25b43d1163ce4f066f6db9d706f"} Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.173743 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.316960 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovncontroller-config-0\") pod \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.317094 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ssh-key-openstack-edpm-ipam\") pod \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.317344 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovn-combined-ca-bundle\") pod \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.317381 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-inventory\") pod \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.317406 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znfrc\" (UniqueName: \"kubernetes.io/projected/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-kube-api-access-znfrc\") pod \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\" (UID: \"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272\") " Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.322712 4817 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-kube-api-access-znfrc" (OuterVolumeSpecName: "kube-api-access-znfrc") pod "ae44a6c3-1d36-4f95-b52a-a1bedc6ec272" (UID: "ae44a6c3-1d36-4f95-b52a-a1bedc6ec272"). InnerVolumeSpecName "kube-api-access-znfrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.324353 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ae44a6c3-1d36-4f95-b52a-a1bedc6ec272" (UID: "ae44a6c3-1d36-4f95-b52a-a1bedc6ec272"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.360250 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ae44a6c3-1d36-4f95-b52a-a1bedc6ec272" (UID: "ae44a6c3-1d36-4f95-b52a-a1bedc6ec272"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.365757 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-inventory" (OuterVolumeSpecName: "inventory") pod "ae44a6c3-1d36-4f95-b52a-a1bedc6ec272" (UID: "ae44a6c3-1d36-4f95-b52a-a1bedc6ec272"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.384427 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ae44a6c3-1d36-4f95-b52a-a1bedc6ec272" (UID: "ae44a6c3-1d36-4f95-b52a-a1bedc6ec272"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.420644 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.420680 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znfrc\" (UniqueName: \"kubernetes.io/projected/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-kube-api-access-znfrc\") on node \"crc\" DevicePath \"\"" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.420695 4817 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.420704 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.420715 4817 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae44a6c3-1d36-4f95-b52a-a1bedc6ec272-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.708090 4817 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" event={"ID":"ae44a6c3-1d36-4f95-b52a-a1bedc6ec272","Type":"ContainerDied","Data":"2c3532432fd2eb750e5741b91f880f4124e5bd56a78b54144268d691de246718"} Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.708133 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c3532432fd2eb750e5741b91f880f4124e5bd56a78b54144268d691de246718" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.708151 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6twbf" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.843707 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2"] Feb 18 14:35:25 crc kubenswrapper[4817]: E0218 14:35:25.844219 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae44a6c3-1d36-4f95-b52a-a1bedc6ec272" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.844241 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae44a6c3-1d36-4f95-b52a-a1bedc6ec272" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.844518 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae44a6c3-1d36-4f95-b52a-a1bedc6ec272" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.845483 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.850257 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.850999 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.852886 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.853526 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.853547 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.853659 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.859510 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2"] Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.929646 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.929703 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.929749 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.930001 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv7w4\" (UniqueName: \"kubernetes.io/projected/43e2549e-9d03-495a-852e-0d0c283c5d51-kube-api-access-qv7w4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.930075 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:25 crc kubenswrapper[4817]: I0218 14:35:25.930116 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.031531 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv7w4\" (UniqueName: \"kubernetes.io/projected/43e2549e-9d03-495a-852e-0d0c283c5d51-kube-api-access-qv7w4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.031574 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.031605 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.031638 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.031660 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.031687 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.036297 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.036320 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.036484 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.037141 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.037877 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.050957 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv7w4\" (UniqueName: \"kubernetes.io/projected/43e2549e-9d03-495a-852e-0d0c283c5d51-kube-api-access-qv7w4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2\" (UID: 
\"43e2549e-9d03-495a-852e-0d0c283c5d51\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.167264 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.724430 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.739949 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2"] Feb 18 14:35:26 crc kubenswrapper[4817]: I0218 14:35:26.750826 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" event={"ID":"43e2549e-9d03-495a-852e-0d0c283c5d51","Type":"ContainerStarted","Data":"f9e9f7b9ded0b10ef70c37ea65c569b1fbe6e8953558b837762518c2c0594e86"} Feb 18 14:35:27 crc kubenswrapper[4817]: I0218 14:35:27.761965 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" event={"ID":"43e2549e-9d03-495a-852e-0d0c283c5d51","Type":"ContainerStarted","Data":"f9eb751dfedf89c9b34d7bf854f01b58b83feb005d83b32ff405d03850d5426b"} Feb 18 14:35:27 crc kubenswrapper[4817]: I0218 14:35:27.790288 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" podStartSLOduration=2.340906475 podStartE2EDuration="2.790260683s" podCreationTimestamp="2026-02-18 14:35:25 +0000 UTC" firstStartedPulling="2026-02-18 14:35:26.724133601 +0000 UTC m=+2189.299669584" lastFinishedPulling="2026-02-18 14:35:27.173487809 +0000 UTC m=+2189.749023792" observedRunningTime="2026-02-18 14:35:27.778936377 +0000 UTC m=+2190.354472370" watchObservedRunningTime="2026-02-18 
14:35:27.790260683 +0000 UTC m=+2190.365796666" Feb 18 14:35:42 crc kubenswrapper[4817]: I0218 14:35:42.863441 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:35:42 crc kubenswrapper[4817]: I0218 14:35:42.865147 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:35:42 crc kubenswrapper[4817]: I0218 14:35:42.865234 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 14:35:42 crc kubenswrapper[4817]: I0218 14:35:42.866184 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"350b94e016e63e9dbf9f1cb2943e19d533f92423c570ba0a133fb08ef9bb2a0b"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 14:35:42 crc kubenswrapper[4817]: I0218 14:35:42.866276 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://350b94e016e63e9dbf9f1cb2943e19d533f92423c570ba0a133fb08ef9bb2a0b" gracePeriod=600 Feb 18 14:35:43 crc kubenswrapper[4817]: I0218 14:35:43.897388 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" 
containerID="350b94e016e63e9dbf9f1cb2943e19d533f92423c570ba0a133fb08ef9bb2a0b" exitCode=0 Feb 18 14:35:43 crc kubenswrapper[4817]: I0218 14:35:43.897934 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"350b94e016e63e9dbf9f1cb2943e19d533f92423c570ba0a133fb08ef9bb2a0b"} Feb 18 14:35:43 crc kubenswrapper[4817]: I0218 14:35:43.897961 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"} Feb 18 14:35:43 crc kubenswrapper[4817]: I0218 14:35:43.897995 4817 scope.go:117] "RemoveContainer" containerID="77626d3d8b06dd06e4699408c75ed0f84d2534db6eae4ab257458701371a9858" Feb 18 14:36:14 crc kubenswrapper[4817]: I0218 14:36:14.181919 4817 generic.go:334] "Generic (PLEG): container finished" podID="43e2549e-9d03-495a-852e-0d0c283c5d51" containerID="f9eb751dfedf89c9b34d7bf854f01b58b83feb005d83b32ff405d03850d5426b" exitCode=0 Feb 18 14:36:14 crc kubenswrapper[4817]: I0218 14:36:14.183659 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" event={"ID":"43e2549e-9d03-495a-852e-0d0c283c5d51","Type":"ContainerDied","Data":"f9eb751dfedf89c9b34d7bf854f01b58b83feb005d83b32ff405d03850d5426b"} Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.776259 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.906283 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-ssh-key-openstack-edpm-ipam\") pod \"43e2549e-9d03-495a-852e-0d0c283c5d51\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.906397 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-metadata-combined-ca-bundle\") pod \"43e2549e-9d03-495a-852e-0d0c283c5d51\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.906576 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv7w4\" (UniqueName: \"kubernetes.io/projected/43e2549e-9d03-495a-852e-0d0c283c5d51-kube-api-access-qv7w4\") pod \"43e2549e-9d03-495a-852e-0d0c283c5d51\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.906617 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-ovn-metadata-agent-neutron-config-0\") pod \"43e2549e-9d03-495a-852e-0d0c283c5d51\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.906687 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-nova-metadata-neutron-config-0\") pod \"43e2549e-9d03-495a-852e-0d0c283c5d51\" (UID: 
\"43e2549e-9d03-495a-852e-0d0c283c5d51\") " Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.906705 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-inventory\") pod \"43e2549e-9d03-495a-852e-0d0c283c5d51\" (UID: \"43e2549e-9d03-495a-852e-0d0c283c5d51\") " Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.912380 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "43e2549e-9d03-495a-852e-0d0c283c5d51" (UID: "43e2549e-9d03-495a-852e-0d0c283c5d51"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.912845 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43e2549e-9d03-495a-852e-0d0c283c5d51-kube-api-access-qv7w4" (OuterVolumeSpecName: "kube-api-access-qv7w4") pod "43e2549e-9d03-495a-852e-0d0c283c5d51" (UID: "43e2549e-9d03-495a-852e-0d0c283c5d51"). InnerVolumeSpecName "kube-api-access-qv7w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.937973 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "43e2549e-9d03-495a-852e-0d0c283c5d51" (UID: "43e2549e-9d03-495a-852e-0d0c283c5d51"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.938307 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "43e2549e-9d03-495a-852e-0d0c283c5d51" (UID: "43e2549e-9d03-495a-852e-0d0c283c5d51"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.938836 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-inventory" (OuterVolumeSpecName: "inventory") pod "43e2549e-9d03-495a-852e-0d0c283c5d51" (UID: "43e2549e-9d03-495a-852e-0d0c283c5d51"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:36:15 crc kubenswrapper[4817]: I0218 14:36:15.941176 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "43e2549e-9d03-495a-852e-0d0c283c5d51" (UID: "43e2549e-9d03-495a-852e-0d0c283c5d51"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.009861 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.009898 4817 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.009911 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.009922 4817 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.009937 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv7w4\" (UniqueName: \"kubernetes.io/projected/43e2549e-9d03-495a-852e-0d0c283c5d51-kube-api-access-qv7w4\") on node \"crc\" DevicePath \"\"" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.009949 4817 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/43e2549e-9d03-495a-852e-0d0c283c5d51-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.203009 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" event={"ID":"43e2549e-9d03-495a-852e-0d0c283c5d51","Type":"ContainerDied","Data":"f9e9f7b9ded0b10ef70c37ea65c569b1fbe6e8953558b837762518c2c0594e86"} Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.203057 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9e9f7b9ded0b10ef70c37ea65c569b1fbe6e8953558b837762518c2c0594e86" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.203065 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.335296 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2"] Feb 18 14:36:16 crc kubenswrapper[4817]: E0218 14:36:16.335776 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43e2549e-9d03-495a-852e-0d0c283c5d51" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.335790 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="43e2549e-9d03-495a-852e-0d0c283c5d51" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.335991 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="43e2549e-9d03-495a-852e-0d0c283c5d51" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.343377 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.348728 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.349012 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.353433 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.353771 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.353989 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.385336 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2"] Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.423313 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.423436 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: 
\"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.423535 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.423611 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.423649 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5tsn\" (UniqueName: \"kubernetes.io/projected/43d12b3f-f980-4075-8684-a97141a5474d-kube-api-access-b5tsn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.525157 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.525296 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.525366 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.525391 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.525417 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5tsn\" (UniqueName: \"kubernetes.io/projected/43d12b3f-f980-4075-8684-a97141a5474d-kube-api-access-b5tsn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.528915 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.528931 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.528949 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.529730 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.542155 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5tsn\" (UniqueName: \"kubernetes.io/projected/43d12b3f-f980-4075-8684-a97141a5474d-kube-api-access-b5tsn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:16 crc kubenswrapper[4817]: I0218 14:36:16.703665 4817 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:36:17 crc kubenswrapper[4817]: I0218 14:36:17.278859 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2"] Feb 18 14:36:18 crc kubenswrapper[4817]: I0218 14:36:18.229036 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" event={"ID":"43d12b3f-f980-4075-8684-a97141a5474d","Type":"ContainerStarted","Data":"4813131052d8d4176fe80b9a4078cbfa21c3db3c004bf68d39815e51ea320642"} Feb 18 14:36:18 crc kubenswrapper[4817]: I0218 14:36:18.229864 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" event={"ID":"43d12b3f-f980-4075-8684-a97141a5474d","Type":"ContainerStarted","Data":"6e1e268fc78a501984a69cede89253b84e2bfc2af9e7b75d9f19f318e1c5c52a"} Feb 18 14:36:18 crc kubenswrapper[4817]: I0218 14:36:18.248057 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" podStartSLOduration=1.7533444459999998 podStartE2EDuration="2.248037031s" podCreationTimestamp="2026-02-18 14:36:16 +0000 UTC" firstStartedPulling="2026-02-18 14:36:17.29181812 +0000 UTC m=+2239.867354103" lastFinishedPulling="2026-02-18 14:36:17.786510695 +0000 UTC m=+2240.362046688" observedRunningTime="2026-02-18 14:36:18.247671052 +0000 UTC m=+2240.823207035" watchObservedRunningTime="2026-02-18 14:36:18.248037031 +0000 UTC m=+2240.823573014" Feb 18 14:37:06 crc kubenswrapper[4817]: I0218 14:37:06.942093 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dl8"] Feb 18 14:37:06 crc kubenswrapper[4817]: I0218 14:37:06.945001 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:06 crc kubenswrapper[4817]: I0218 14:37:06.959091 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dl8"] Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.064150 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-catalog-content\") pod \"redhat-marketplace-k7dl8\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.064469 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-utilities\") pod \"redhat-marketplace-k7dl8\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.064626 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd8r2\" (UniqueName: \"kubernetes.io/projected/3a7119d6-86df-4859-adc4-ee256c56da00-kube-api-access-kd8r2\") pod \"redhat-marketplace-k7dl8\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.166709 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-catalog-content\") pod \"redhat-marketplace-k7dl8\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.166954 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-utilities\") pod \"redhat-marketplace-k7dl8\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.167081 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd8r2\" (UniqueName: \"kubernetes.io/projected/3a7119d6-86df-4859-adc4-ee256c56da00-kube-api-access-kd8r2\") pod \"redhat-marketplace-k7dl8\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.167144 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-catalog-content\") pod \"redhat-marketplace-k7dl8\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.167463 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-utilities\") pod \"redhat-marketplace-k7dl8\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.189949 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd8r2\" (UniqueName: \"kubernetes.io/projected/3a7119d6-86df-4859-adc4-ee256c56da00-kube-api-access-kd8r2\") pod \"redhat-marketplace-k7dl8\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.269339 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:07 crc kubenswrapper[4817]: I0218 14:37:07.806413 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dl8"] Feb 18 14:37:07 crc kubenswrapper[4817]: W0218 14:37:07.807817 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a7119d6_86df_4859_adc4_ee256c56da00.slice/crio-8740c9250ee6c3c76e985994009529bc290191f6b08db54c6be3b07ff1f2d1e1 WatchSource:0}: Error finding container 8740c9250ee6c3c76e985994009529bc290191f6b08db54c6be3b07ff1f2d1e1: Status 404 returned error can't find the container with id 8740c9250ee6c3c76e985994009529bc290191f6b08db54c6be3b07ff1f2d1e1 Feb 18 14:37:08 crc kubenswrapper[4817]: I0218 14:37:08.699435 4817 generic.go:334] "Generic (PLEG): container finished" podID="3a7119d6-86df-4859-adc4-ee256c56da00" containerID="75d8c6124b37b666999ba1e7c0abf1a6f13fb590a83d1fa9c1fa8df583bce33f" exitCode=0 Feb 18 14:37:08 crc kubenswrapper[4817]: I0218 14:37:08.699487 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dl8" event={"ID":"3a7119d6-86df-4859-adc4-ee256c56da00","Type":"ContainerDied","Data":"75d8c6124b37b666999ba1e7c0abf1a6f13fb590a83d1fa9c1fa8df583bce33f"} Feb 18 14:37:08 crc kubenswrapper[4817]: I0218 14:37:08.700000 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dl8" event={"ID":"3a7119d6-86df-4859-adc4-ee256c56da00","Type":"ContainerStarted","Data":"8740c9250ee6c3c76e985994009529bc290191f6b08db54c6be3b07ff1f2d1e1"} Feb 18 14:37:10 crc kubenswrapper[4817]: I0218 14:37:10.721570 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dl8" 
event={"ID":"3a7119d6-86df-4859-adc4-ee256c56da00","Type":"ContainerStarted","Data":"14ed2a1a0b29db201ffd59b9c99375e29bc40f42d6fe5d69c4fe2b8a0218845a"} Feb 18 14:37:11 crc kubenswrapper[4817]: I0218 14:37:11.733526 4817 generic.go:334] "Generic (PLEG): container finished" podID="3a7119d6-86df-4859-adc4-ee256c56da00" containerID="14ed2a1a0b29db201ffd59b9c99375e29bc40f42d6fe5d69c4fe2b8a0218845a" exitCode=0 Feb 18 14:37:11 crc kubenswrapper[4817]: I0218 14:37:11.733571 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dl8" event={"ID":"3a7119d6-86df-4859-adc4-ee256c56da00","Type":"ContainerDied","Data":"14ed2a1a0b29db201ffd59b9c99375e29bc40f42d6fe5d69c4fe2b8a0218845a"} Feb 18 14:37:12 crc kubenswrapper[4817]: I0218 14:37:12.746725 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dl8" event={"ID":"3a7119d6-86df-4859-adc4-ee256c56da00","Type":"ContainerStarted","Data":"49d6349a52c9b341e57715d4c2e8a8f56d69382618b4e47aaf56790b6ed34eab"} Feb 18 14:37:12 crc kubenswrapper[4817]: I0218 14:37:12.784324 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k7dl8" podStartSLOduration=3.275251936 podStartE2EDuration="6.784300302s" podCreationTimestamp="2026-02-18 14:37:06 +0000 UTC" firstStartedPulling="2026-02-18 14:37:08.702206168 +0000 UTC m=+2291.277742151" lastFinishedPulling="2026-02-18 14:37:12.211254534 +0000 UTC m=+2294.786790517" observedRunningTime="2026-02-18 14:37:12.772252517 +0000 UTC m=+2295.347788500" watchObservedRunningTime="2026-02-18 14:37:12.784300302 +0000 UTC m=+2295.359836295" Feb 18 14:37:17 crc kubenswrapper[4817]: I0218 14:37:17.270464 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:17 crc kubenswrapper[4817]: I0218 14:37:17.272054 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:17 crc kubenswrapper[4817]: I0218 14:37:17.341882 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:17 crc kubenswrapper[4817]: I0218 14:37:17.852257 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:17 crc kubenswrapper[4817]: I0218 14:37:17.902426 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dl8"] Feb 18 14:37:19 crc kubenswrapper[4817]: I0218 14:37:19.809880 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k7dl8" podUID="3a7119d6-86df-4859-adc4-ee256c56da00" containerName="registry-server" containerID="cri-o://49d6349a52c9b341e57715d4c2e8a8f56d69382618b4e47aaf56790b6ed34eab" gracePeriod=2 Feb 18 14:37:20 crc kubenswrapper[4817]: I0218 14:37:20.820756 4817 generic.go:334] "Generic (PLEG): container finished" podID="3a7119d6-86df-4859-adc4-ee256c56da00" containerID="49d6349a52c9b341e57715d4c2e8a8f56d69382618b4e47aaf56790b6ed34eab" exitCode=0 Feb 18 14:37:20 crc kubenswrapper[4817]: I0218 14:37:20.820786 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dl8" event={"ID":"3a7119d6-86df-4859-adc4-ee256c56da00","Type":"ContainerDied","Data":"49d6349a52c9b341e57715d4c2e8a8f56d69382618b4e47aaf56790b6ed34eab"} Feb 18 14:37:20 crc kubenswrapper[4817]: I0218 14:37:20.944812 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.053886 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-catalog-content\") pod \"3a7119d6-86df-4859-adc4-ee256c56da00\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.053991 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd8r2\" (UniqueName: \"kubernetes.io/projected/3a7119d6-86df-4859-adc4-ee256c56da00-kube-api-access-kd8r2\") pod \"3a7119d6-86df-4859-adc4-ee256c56da00\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.054138 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-utilities\") pod \"3a7119d6-86df-4859-adc4-ee256c56da00\" (UID: \"3a7119d6-86df-4859-adc4-ee256c56da00\") " Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.055698 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-utilities" (OuterVolumeSpecName: "utilities") pod "3a7119d6-86df-4859-adc4-ee256c56da00" (UID: "3a7119d6-86df-4859-adc4-ee256c56da00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.063319 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a7119d6-86df-4859-adc4-ee256c56da00-kube-api-access-kd8r2" (OuterVolumeSpecName: "kube-api-access-kd8r2") pod "3a7119d6-86df-4859-adc4-ee256c56da00" (UID: "3a7119d6-86df-4859-adc4-ee256c56da00"). InnerVolumeSpecName "kube-api-access-kd8r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.080208 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a7119d6-86df-4859-adc4-ee256c56da00" (UID: "3a7119d6-86df-4859-adc4-ee256c56da00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.156410 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.156454 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a7119d6-86df-4859-adc4-ee256c56da00-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.156467 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd8r2\" (UniqueName: \"kubernetes.io/projected/3a7119d6-86df-4859-adc4-ee256c56da00-kube-api-access-kd8r2\") on node \"crc\" DevicePath \"\"" Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.834791 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k7dl8" event={"ID":"3a7119d6-86df-4859-adc4-ee256c56da00","Type":"ContainerDied","Data":"8740c9250ee6c3c76e985994009529bc290191f6b08db54c6be3b07ff1f2d1e1"} Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.834838 4817 scope.go:117] "RemoveContainer" containerID="49d6349a52c9b341e57715d4c2e8a8f56d69382618b4e47aaf56790b6ed34eab" Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.834896 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k7dl8" Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.866403 4817 scope.go:117] "RemoveContainer" containerID="14ed2a1a0b29db201ffd59b9c99375e29bc40f42d6fe5d69c4fe2b8a0218845a" Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.876652 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dl8"] Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.889776 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k7dl8"] Feb 18 14:37:21 crc kubenswrapper[4817]: I0218 14:37:21.905777 4817 scope.go:117] "RemoveContainer" containerID="75d8c6124b37b666999ba1e7c0abf1a6f13fb590a83d1fa9c1fa8df583bce33f" Feb 18 14:37:22 crc kubenswrapper[4817]: I0218 14:37:22.183402 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a7119d6-86df-4859-adc4-ee256c56da00" path="/var/lib/kubelet/pods/3a7119d6-86df-4859-adc4-ee256c56da00/volumes" Feb 18 14:38:12 crc kubenswrapper[4817]: I0218 14:38:12.863995 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:38:12 crc kubenswrapper[4817]: I0218 14:38:12.864524 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:38:42 crc kubenswrapper[4817]: I0218 14:38:42.863691 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:38:42 crc kubenswrapper[4817]: I0218 14:38:42.864335 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.639900 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7bskb"] Feb 18 14:38:59 crc kubenswrapper[4817]: E0218 14:38:59.641093 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a7119d6-86df-4859-adc4-ee256c56da00" containerName="extract-utilities" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.641112 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a7119d6-86df-4859-adc4-ee256c56da00" containerName="extract-utilities" Feb 18 14:38:59 crc kubenswrapper[4817]: E0218 14:38:59.641142 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a7119d6-86df-4859-adc4-ee256c56da00" containerName="registry-server" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.641149 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a7119d6-86df-4859-adc4-ee256c56da00" containerName="registry-server" Feb 18 14:38:59 crc kubenswrapper[4817]: E0218 14:38:59.641163 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a7119d6-86df-4859-adc4-ee256c56da00" containerName="extract-content" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.641168 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a7119d6-86df-4859-adc4-ee256c56da00" containerName="extract-content" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.641380 4817 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3a7119d6-86df-4859-adc4-ee256c56da00" containerName="registry-server" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.642950 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.662260 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7bskb"] Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.762676 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctl58\" (UniqueName: \"kubernetes.io/projected/ff3031a6-0a5f-4644-a824-5543706de42e-kube-api-access-ctl58\") pod \"community-operators-7bskb\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.762829 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-catalog-content\") pod \"community-operators-7bskb\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.762872 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-utilities\") pod \"community-operators-7bskb\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.864293 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctl58\" (UniqueName: \"kubernetes.io/projected/ff3031a6-0a5f-4644-a824-5543706de42e-kube-api-access-ctl58\") 
pod \"community-operators-7bskb\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.864425 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-catalog-content\") pod \"community-operators-7bskb\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.864462 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-utilities\") pod \"community-operators-7bskb\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.864895 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-utilities\") pod \"community-operators-7bskb\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.865000 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-catalog-content\") pod \"community-operators-7bskb\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.891955 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctl58\" (UniqueName: \"kubernetes.io/projected/ff3031a6-0a5f-4644-a824-5543706de42e-kube-api-access-ctl58\") pod \"community-operators-7bskb\" (UID: 
\"ff3031a6-0a5f-4644-a824-5543706de42e\") " pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:38:59 crc kubenswrapper[4817]: I0218 14:38:59.969179 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:39:00 crc kubenswrapper[4817]: I0218 14:39:00.494616 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7bskb"] Feb 18 14:39:00 crc kubenswrapper[4817]: I0218 14:39:00.831202 4817 generic.go:334] "Generic (PLEG): container finished" podID="ff3031a6-0a5f-4644-a824-5543706de42e" containerID="ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b" exitCode=0 Feb 18 14:39:00 crc kubenswrapper[4817]: I0218 14:39:00.831455 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bskb" event={"ID":"ff3031a6-0a5f-4644-a824-5543706de42e","Type":"ContainerDied","Data":"ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b"} Feb 18 14:39:00 crc kubenswrapper[4817]: I0218 14:39:00.831533 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bskb" event={"ID":"ff3031a6-0a5f-4644-a824-5543706de42e","Type":"ContainerStarted","Data":"bf757fb2c49248d72d18bc0e70cf15dec87aa85162b64281d4ec2c29c3f4122e"} Feb 18 14:39:02 crc kubenswrapper[4817]: I0218 14:39:02.852017 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bskb" event={"ID":"ff3031a6-0a5f-4644-a824-5543706de42e","Type":"ContainerStarted","Data":"6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4"} Feb 18 14:39:03 crc kubenswrapper[4817]: I0218 14:39:03.864655 4817 generic.go:334] "Generic (PLEG): container finished" podID="ff3031a6-0a5f-4644-a824-5543706de42e" containerID="6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4" exitCode=0 Feb 18 14:39:03 crc kubenswrapper[4817]: I0218 
14:39:03.864701 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bskb" event={"ID":"ff3031a6-0a5f-4644-a824-5543706de42e","Type":"ContainerDied","Data":"6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4"} Feb 18 14:39:04 crc kubenswrapper[4817]: I0218 14:39:04.878399 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bskb" event={"ID":"ff3031a6-0a5f-4644-a824-5543706de42e","Type":"ContainerStarted","Data":"9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2"} Feb 18 14:39:04 crc kubenswrapper[4817]: I0218 14:39:04.911261 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7bskb" podStartSLOduration=2.472344689 podStartE2EDuration="5.911242561s" podCreationTimestamp="2026-02-18 14:38:59 +0000 UTC" firstStartedPulling="2026-02-18 14:39:00.834046251 +0000 UTC m=+2403.409582244" lastFinishedPulling="2026-02-18 14:39:04.272944133 +0000 UTC m=+2406.848480116" observedRunningTime="2026-02-18 14:39:04.901661448 +0000 UTC m=+2407.477197471" watchObservedRunningTime="2026-02-18 14:39:04.911242561 +0000 UTC m=+2407.486778544" Feb 18 14:39:09 crc kubenswrapper[4817]: I0218 14:39:09.970139 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:39:09 crc kubenswrapper[4817]: I0218 14:39:09.970841 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:39:10 crc kubenswrapper[4817]: I0218 14:39:10.022854 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:39:10 crc kubenswrapper[4817]: I0218 14:39:10.983673 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7bskb" Feb 
18 14:39:11 crc kubenswrapper[4817]: I0218 14:39:11.039316 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7bskb"] Feb 18 14:39:12 crc kubenswrapper[4817]: I0218 14:39:12.863728 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:39:12 crc kubenswrapper[4817]: I0218 14:39:12.863793 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:39:12 crc kubenswrapper[4817]: I0218 14:39:12.863859 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 14:39:12 crc kubenswrapper[4817]: I0218 14:39:12.864820 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 14:39:12 crc kubenswrapper[4817]: I0218 14:39:12.864885 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" gracePeriod=600 Feb 18 14:39:12 crc kubenswrapper[4817]: I0218 
14:39:12.949311 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7bskb" podUID="ff3031a6-0a5f-4644-a824-5543706de42e" containerName="registry-server" containerID="cri-o://9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2" gracePeriod=2 Feb 18 14:39:12 crc kubenswrapper[4817]: E0218 14:39:12.984080 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.518086 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.553853 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-catalog-content\") pod \"ff3031a6-0a5f-4644-a824-5543706de42e\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.554268 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctl58\" (UniqueName: \"kubernetes.io/projected/ff3031a6-0a5f-4644-a824-5543706de42e-kube-api-access-ctl58\") pod \"ff3031a6-0a5f-4644-a824-5543706de42e\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.554416 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-utilities\") pod 
\"ff3031a6-0a5f-4644-a824-5543706de42e\" (UID: \"ff3031a6-0a5f-4644-a824-5543706de42e\") " Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.555550 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-utilities" (OuterVolumeSpecName: "utilities") pod "ff3031a6-0a5f-4644-a824-5543706de42e" (UID: "ff3031a6-0a5f-4644-a824-5543706de42e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.556195 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.566780 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3031a6-0a5f-4644-a824-5543706de42e-kube-api-access-ctl58" (OuterVolumeSpecName: "kube-api-access-ctl58") pod "ff3031a6-0a5f-4644-a824-5543706de42e" (UID: "ff3031a6-0a5f-4644-a824-5543706de42e"). InnerVolumeSpecName "kube-api-access-ctl58". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.609508 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff3031a6-0a5f-4644-a824-5543706de42e" (UID: "ff3031a6-0a5f-4644-a824-5543706de42e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.658879 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff3031a6-0a5f-4644-a824-5543706de42e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.658923 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctl58\" (UniqueName: \"kubernetes.io/projected/ff3031a6-0a5f-4644-a824-5543706de42e-kube-api-access-ctl58\") on node \"crc\" DevicePath \"\"" Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.963405 4817 generic.go:334] "Generic (PLEG): container finished" podID="ff3031a6-0a5f-4644-a824-5543706de42e" containerID="9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2" exitCode=0 Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.963487 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7bskb" Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.963494 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bskb" event={"ID":"ff3031a6-0a5f-4644-a824-5543706de42e","Type":"ContainerDied","Data":"9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2"} Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.964096 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bskb" event={"ID":"ff3031a6-0a5f-4644-a824-5543706de42e","Type":"ContainerDied","Data":"bf757fb2c49248d72d18bc0e70cf15dec87aa85162b64281d4ec2c29c3f4122e"} Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.964131 4817 scope.go:117] "RemoveContainer" containerID="9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2" Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.966836 4817 generic.go:334] "Generic (PLEG): container 
finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" exitCode=0 Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.966877 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"} Feb 18 14:39:13 crc kubenswrapper[4817]: I0218 14:39:13.967294 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:39:13 crc kubenswrapper[4817]: E0218 14:39:13.967623 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.007146 4817 scope.go:117] "RemoveContainer" containerID="6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4" Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.019095 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7bskb"] Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.033190 4817 scope.go:117] "RemoveContainer" containerID="ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b" Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.040160 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7bskb"] Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.084380 4817 scope.go:117] "RemoveContainer" 
containerID="9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2" Feb 18 14:39:14 crc kubenswrapper[4817]: E0218 14:39:14.084831 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2\": container with ID starting with 9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2 not found: ID does not exist" containerID="9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2" Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.084877 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2"} err="failed to get container status \"9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2\": rpc error: code = NotFound desc = could not find container \"9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2\": container with ID starting with 9453190932fb0dd2b40de0330cd66f80ad806852c8ee7e5dac1dd70e11dbc1b2 not found: ID does not exist" Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.084906 4817 scope.go:117] "RemoveContainer" containerID="6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4" Feb 18 14:39:14 crc kubenswrapper[4817]: E0218 14:39:14.085166 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4\": container with ID starting with 6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4 not found: ID does not exist" containerID="6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4" Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.085199 4817 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4"} err="failed to get container status \"6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4\": rpc error: code = NotFound desc = could not find container \"6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4\": container with ID starting with 6e57058287ffbe2beac1cbba1a1a339345e6600d6cface8f405a4a5764de73d4 not found: ID does not exist" Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.085218 4817 scope.go:117] "RemoveContainer" containerID="ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b" Feb 18 14:39:14 crc kubenswrapper[4817]: E0218 14:39:14.085435 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b\": container with ID starting with ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b not found: ID does not exist" containerID="ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b" Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.085464 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b"} err="failed to get container status \"ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b\": rpc error: code = NotFound desc = could not find container \"ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b\": container with ID starting with ef480b9404cc0568d798355d7ce029d9bb17d1931a5616897c8449bdfcd9ce4b not found: ID does not exist" Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.085481 4817 scope.go:117] "RemoveContainer" containerID="350b94e016e63e9dbf9f1cb2943e19d533f92423c570ba0a133fb08ef9bb2a0b" Feb 18 14:39:14 crc kubenswrapper[4817]: I0218 14:39:14.185783 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="ff3031a6-0a5f-4644-a824-5543706de42e" path="/var/lib/kubelet/pods/ff3031a6-0a5f-4644-a824-5543706de42e/volumes" Feb 18 14:39:26 crc kubenswrapper[4817]: I0218 14:39:26.172190 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:39:26 crc kubenswrapper[4817]: E0218 14:39:26.174736 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:39:41 crc kubenswrapper[4817]: I0218 14:39:41.172034 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:39:41 crc kubenswrapper[4817]: E0218 14:39:41.172852 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:39:53 crc kubenswrapper[4817]: I0218 14:39:53.173672 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:39:53 crc kubenswrapper[4817]: E0218 14:39:53.176302 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:39:53 crc kubenswrapper[4817]: I0218 14:39:53.362252 4817 generic.go:334] "Generic (PLEG): container finished" podID="43d12b3f-f980-4075-8684-a97141a5474d" containerID="4813131052d8d4176fe80b9a4078cbfa21c3db3c004bf68d39815e51ea320642" exitCode=0 Feb 18 14:39:53 crc kubenswrapper[4817]: I0218 14:39:53.362294 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" event={"ID":"43d12b3f-f980-4075-8684-a97141a5474d","Type":"ContainerDied","Data":"4813131052d8d4176fe80b9a4078cbfa21c3db3c004bf68d39815e51ea320642"} Feb 18 14:39:54 crc kubenswrapper[4817]: I0218 14:39:54.964487 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.051414 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-secret-0\") pod \"43d12b3f-f980-4075-8684-a97141a5474d\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.052311 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5tsn\" (UniqueName: \"kubernetes.io/projected/43d12b3f-f980-4075-8684-a97141a5474d-kube-api-access-b5tsn\") pod \"43d12b3f-f980-4075-8684-a97141a5474d\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.052423 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-inventory\") pod 
\"43d12b3f-f980-4075-8684-a97141a5474d\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.052478 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-ssh-key-openstack-edpm-ipam\") pod \"43d12b3f-f980-4075-8684-a97141a5474d\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.052537 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-combined-ca-bundle\") pod \"43d12b3f-f980-4075-8684-a97141a5474d\" (UID: \"43d12b3f-f980-4075-8684-a97141a5474d\") " Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.057462 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "43d12b3f-f980-4075-8684-a97141a5474d" (UID: "43d12b3f-f980-4075-8684-a97141a5474d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.057663 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d12b3f-f980-4075-8684-a97141a5474d-kube-api-access-b5tsn" (OuterVolumeSpecName: "kube-api-access-b5tsn") pod "43d12b3f-f980-4075-8684-a97141a5474d" (UID: "43d12b3f-f980-4075-8684-a97141a5474d"). InnerVolumeSpecName "kube-api-access-b5tsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.079922 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-inventory" (OuterVolumeSpecName: "inventory") pod "43d12b3f-f980-4075-8684-a97141a5474d" (UID: "43d12b3f-f980-4075-8684-a97141a5474d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.080718 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "43d12b3f-f980-4075-8684-a97141a5474d" (UID: "43d12b3f-f980-4075-8684-a97141a5474d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.091500 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "43d12b3f-f980-4075-8684-a97141a5474d" (UID: "43d12b3f-f980-4075-8684-a97141a5474d"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.154972 4817 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.155016 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5tsn\" (UniqueName: \"kubernetes.io/projected/43d12b3f-f980-4075-8684-a97141a5474d-kube-api-access-b5tsn\") on node \"crc\" DevicePath \"\"" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.155028 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.155037 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.155046 4817 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d12b3f-f980-4075-8684-a97141a5474d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.380683 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" event={"ID":"43d12b3f-f980-4075-8684-a97141a5474d","Type":"ContainerDied","Data":"6e1e268fc78a501984a69cede89253b84e2bfc2af9e7b75d9f19f318e1c5c52a"} Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.380733 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e1e268fc78a501984a69cede89253b84e2bfc2af9e7b75d9f19f318e1c5c52a" Feb 18 
14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.380802 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.474225 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"] Feb 18 14:39:55 crc kubenswrapper[4817]: E0218 14:39:55.474678 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3031a6-0a5f-4644-a824-5543706de42e" containerName="extract-utilities" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.474700 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3031a6-0a5f-4644-a824-5543706de42e" containerName="extract-utilities" Feb 18 14:39:55 crc kubenswrapper[4817]: E0218 14:39:55.474715 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d12b3f-f980-4075-8684-a97141a5474d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.474725 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d12b3f-f980-4075-8684-a97141a5474d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 14:39:55 crc kubenswrapper[4817]: E0218 14:39:55.474748 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3031a6-0a5f-4644-a824-5543706de42e" containerName="extract-content" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.474759 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3031a6-0a5f-4644-a824-5543706de42e" containerName="extract-content" Feb 18 14:39:55 crc kubenswrapper[4817]: E0218 14:39:55.474773 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3031a6-0a5f-4644-a824-5543706de42e" containerName="registry-server" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.474781 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3031a6-0a5f-4644-a824-5543706de42e" 
containerName="registry-server" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.475084 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="43d12b3f-f980-4075-8684-a97141a5474d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.475132 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3031a6-0a5f-4644-a824-5543706de42e" containerName="registry-server" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.476068 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.478219 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.481360 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.481735 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.481881 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.482101 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.482206 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.482457 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.489780 4817 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"] Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.562279 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.562547 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.562604 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.562716 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.562835 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.562884 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l56b\" (UniqueName: \"kubernetes.io/projected/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-kube-api-access-2l56b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.562939 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.563049 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.563105 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.563132 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.563302 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.665394 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.665444 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.665493 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.665549 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.665572 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l56b\" (UniqueName: \"kubernetes.io/projected/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-kube-api-access-2l56b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.665595 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.666482 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.666548 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.666624 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.666647 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.666750 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.666848 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.669400 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.670513 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.670680 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.670757 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.671201 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.671497 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.671770 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.672577 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.676389 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.686249 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l56b\" (UniqueName: \"kubernetes.io/projected/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-kube-api-access-2l56b\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jd5f6\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:55 crc kubenswrapper[4817]: I0218 14:39:55.794586 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:39:56 crc kubenswrapper[4817]: I0218 14:39:56.329323 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"]
Feb 18 14:39:56 crc kubenswrapper[4817]: I0218 14:39:56.390692 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" event={"ID":"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0","Type":"ContainerStarted","Data":"0ab1fe699b0d7e9913331c7576a340509500335360bc6cc05fdc1f9f0024401f"}
Feb 18 14:39:57 crc kubenswrapper[4817]: I0218 14:39:57.400058 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" event={"ID":"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0","Type":"ContainerStarted","Data":"f29c726cfcef758b70f1f0397ef61a9ae5acd71467988d8674c9244f67fd2dc4"}
Feb 18 14:40:08 crc kubenswrapper[4817]: I0218 14:40:08.181080 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"
Feb 18 14:40:08 crc kubenswrapper[4817]: E0218 14:40:08.201135 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:40:20 crc kubenswrapper[4817]: I0218 14:40:20.172112 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"
Feb 18 14:40:20 crc kubenswrapper[4817]: E0218 14:40:20.172926 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:40:35 crc kubenswrapper[4817]: I0218 14:40:35.172577 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"
Feb 18 14:40:35 crc kubenswrapper[4817]: E0218 14:40:35.173444 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:40:46 crc kubenswrapper[4817]: I0218 14:40:46.172046 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"
Feb 18 14:40:46 crc kubenswrapper[4817]: E0218 14:40:46.172709 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:41:01 crc kubenswrapper[4817]: I0218 14:41:01.172104 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"
Feb 18 14:41:01 crc kubenswrapper[4817]: E0218 14:41:01.172876 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:41:05 crc kubenswrapper[4817]: E0218 14:41:05.300798 4817 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.13s"
Feb 18 14:41:16 crc kubenswrapper[4817]: I0218 14:41:16.171864 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"
Feb 18 14:41:16 crc kubenswrapper[4817]: E0218 14:41:16.172758 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.573145 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" podStartSLOduration=83.144423023 podStartE2EDuration="1m23.573121689s" podCreationTimestamp="2026-02-18 14:39:55 +0000 UTC" firstStartedPulling="2026-02-18 14:39:56.335160612 +0000 UTC m=+2458.910696595" lastFinishedPulling="2026-02-18 14:39:56.763859278 +0000 UTC m=+2459.339395261" observedRunningTime="2026-02-18 14:39:57.419062005 +0000 UTC m=+2459.994597988" watchObservedRunningTime="2026-02-18 14:41:18.573121689 +0000 UTC m=+2541.148657682"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.583358 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ffxhn"]
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.587386 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.603988 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ffxhn"]
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.734733 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw29w\" (UniqueName: \"kubernetes.io/projected/284f9866-4eb6-4809-b1e0-3cbc545747ad-kube-api-access-rw29w\") pod \"certified-operators-ffxhn\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") " pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.734827 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-utilities\") pod \"certified-operators-ffxhn\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") " pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.735104 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-catalog-content\") pod \"certified-operators-ffxhn\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") " pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.837342 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw29w\" (UniqueName: \"kubernetes.io/projected/284f9866-4eb6-4809-b1e0-3cbc545747ad-kube-api-access-rw29w\") pod \"certified-operators-ffxhn\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") " pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.837465 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-utilities\") pod \"certified-operators-ffxhn\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") " pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.837528 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-catalog-content\") pod \"certified-operators-ffxhn\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") " pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.838133 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-utilities\") pod \"certified-operators-ffxhn\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") " pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.838158 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-catalog-content\") pod \"certified-operators-ffxhn\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") " pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.861752 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw29w\" (UniqueName: \"kubernetes.io/projected/284f9866-4eb6-4809-b1e0-3cbc545747ad-kube-api-access-rw29w\") pod \"certified-operators-ffxhn\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") " pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:18 crc kubenswrapper[4817]: I0218 14:41:18.904266 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:19 crc kubenswrapper[4817]: I0218 14:41:19.440884 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ffxhn"]
Feb 18 14:41:19 crc kubenswrapper[4817]: W0218 14:41:19.450505 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod284f9866_4eb6_4809_b1e0_3cbc545747ad.slice/crio-565f75c26f0abf1bc4fd6e0ffe0be08c8f136cd9c69b3f521d8ef3297bf76f58 WatchSource:0}: Error finding container 565f75c26f0abf1bc4fd6e0ffe0be08c8f136cd9c69b3f521d8ef3297bf76f58: Status 404 returned error can't find the container with id 565f75c26f0abf1bc4fd6e0ffe0be08c8f136cd9c69b3f521d8ef3297bf76f58
Feb 18 14:41:20 crc kubenswrapper[4817]: I0218 14:41:20.461208 4817 generic.go:334] "Generic (PLEG): container finished" podID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerID="f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5" exitCode=0
Feb 18 14:41:20 crc kubenswrapper[4817]: I0218 14:41:20.461286 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffxhn" event={"ID":"284f9866-4eb6-4809-b1e0-3cbc545747ad","Type":"ContainerDied","Data":"f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5"}
Feb 18 14:41:20 crc kubenswrapper[4817]: I0218 14:41:20.461588 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffxhn" event={"ID":"284f9866-4eb6-4809-b1e0-3cbc545747ad","Type":"ContainerStarted","Data":"565f75c26f0abf1bc4fd6e0ffe0be08c8f136cd9c69b3f521d8ef3297bf76f58"}
Feb 18 14:41:20 crc kubenswrapper[4817]: I0218 14:41:20.463781 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 14:41:21 crc kubenswrapper[4817]: I0218 14:41:21.472767 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffxhn" event={"ID":"284f9866-4eb6-4809-b1e0-3cbc545747ad","Type":"ContainerStarted","Data":"722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f"}
Feb 18 14:41:23 crc kubenswrapper[4817]: I0218 14:41:23.492409 4817 generic.go:334] "Generic (PLEG): container finished" podID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerID="722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f" exitCode=0
Feb 18 14:41:23 crc kubenswrapper[4817]: I0218 14:41:23.492496 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffxhn" event={"ID":"284f9866-4eb6-4809-b1e0-3cbc545747ad","Type":"ContainerDied","Data":"722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f"}
Feb 18 14:41:24 crc kubenswrapper[4817]: I0218 14:41:24.504890 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffxhn" event={"ID":"284f9866-4eb6-4809-b1e0-3cbc545747ad","Type":"ContainerStarted","Data":"feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036"}
Feb 18 14:41:24 crc kubenswrapper[4817]: I0218 14:41:24.543675 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ffxhn" podStartSLOduration=3.10292946 podStartE2EDuration="6.543653728s" podCreationTimestamp="2026-02-18 14:41:18 +0000 UTC" firstStartedPulling="2026-02-18 14:41:20.463517784 +0000 UTC m=+2543.039053767" lastFinishedPulling="2026-02-18 14:41:23.904242052 +0000 UTC m=+2546.479778035" observedRunningTime="2026-02-18 14:41:24.529535291 +0000 UTC m=+2547.105071284" watchObservedRunningTime="2026-02-18 14:41:24.543653728 +0000 UTC m=+2547.119189701"
Feb 18 14:41:28 crc kubenswrapper[4817]: I0218 14:41:28.186443 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"
Feb 18 14:41:28 crc kubenswrapper[4817]: E0218 14:41:28.187269 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:41:28 crc kubenswrapper[4817]: I0218 14:41:28.905040 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:28 crc kubenswrapper[4817]: I0218 14:41:28.905089 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:28 crc kubenswrapper[4817]: I0218 14:41:28.965465 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:29 crc kubenswrapper[4817]: I0218 14:41:29.664529 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:29 crc kubenswrapper[4817]: I0218 14:41:29.755496 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ffxhn"]
Feb 18 14:41:31 crc kubenswrapper[4817]: I0218 14:41:31.595761 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ffxhn" podUID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerName="registry-server" containerID="cri-o://feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036" gracePeriod=2
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.142475 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.209587 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-catalog-content\") pod \"284f9866-4eb6-4809-b1e0-3cbc545747ad\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") "
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.209662 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw29w\" (UniqueName: \"kubernetes.io/projected/284f9866-4eb6-4809-b1e0-3cbc545747ad-kube-api-access-rw29w\") pod \"284f9866-4eb6-4809-b1e0-3cbc545747ad\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") "
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.209709 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-utilities\") pod \"284f9866-4eb6-4809-b1e0-3cbc545747ad\" (UID: \"284f9866-4eb6-4809-b1e0-3cbc545747ad\") "
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.210795 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-utilities" (OuterVolumeSpecName: "utilities") pod "284f9866-4eb6-4809-b1e0-3cbc545747ad" (UID: "284f9866-4eb6-4809-b1e0-3cbc545747ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.217832 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284f9866-4eb6-4809-b1e0-3cbc545747ad-kube-api-access-rw29w" (OuterVolumeSpecName: "kube-api-access-rw29w") pod "284f9866-4eb6-4809-b1e0-3cbc545747ad" (UID: "284f9866-4eb6-4809-b1e0-3cbc545747ad"). InnerVolumeSpecName "kube-api-access-rw29w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.269290 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "284f9866-4eb6-4809-b1e0-3cbc545747ad" (UID: "284f9866-4eb6-4809-b1e0-3cbc545747ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.314186 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.314216 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw29w\" (UniqueName: \"kubernetes.io/projected/284f9866-4eb6-4809-b1e0-3cbc545747ad-kube-api-access-rw29w\") on node \"crc\" DevicePath \"\""
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.314226 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/284f9866-4eb6-4809-b1e0-3cbc545747ad-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.608306 4817 generic.go:334] "Generic (PLEG): container finished" podID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerID="feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036" exitCode=0
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.608359 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffxhn" event={"ID":"284f9866-4eb6-4809-b1e0-3cbc545747ad","Type":"ContainerDied","Data":"feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036"}
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.608391 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ffxhn" event={"ID":"284f9866-4eb6-4809-b1e0-3cbc545747ad","Type":"ContainerDied","Data":"565f75c26f0abf1bc4fd6e0ffe0be08c8f136cd9c69b3f521d8ef3297bf76f58"}
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.608410 4817 scope.go:117] "RemoveContainer" containerID="feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036"
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.608562 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ffxhn"
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.649446 4817 scope.go:117] "RemoveContainer" containerID="722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f"
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.657636 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ffxhn"]
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.673790 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ffxhn"]
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.691723 4817 scope.go:117] "RemoveContainer" containerID="f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5"
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.723431 4817 scope.go:117] "RemoveContainer" containerID="feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036"
Feb 18 14:41:32 crc kubenswrapper[4817]: E0218 14:41:32.723791 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036\": container with ID starting with feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036 not found: ID does not exist" containerID="feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036"
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.723825 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036"} err="failed to get container status \"feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036\": rpc error: code = NotFound desc = could not find container \"feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036\": container with ID starting with feefddb85e224ff64253ef8b89a35be0808a92eda24e87ef3a3a966c40603036 not found: ID does not exist"
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.723847 4817 scope.go:117] "RemoveContainer" containerID="722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f"
Feb 18 14:41:32 crc kubenswrapper[4817]: E0218 14:41:32.724305 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f\": container with ID starting with 722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f not found: ID does not exist" containerID="722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f"
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.724336 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f"} err="failed to get container status \"722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f\": rpc error: code = NotFound desc = could not find container \"722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f\": container with ID starting with 722425ea06291c46dd21c8a5498d272722ac570c8e97ea8a0c2ed43ad535ba8f not found: ID does not exist"
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.724355 4817 scope.go:117] "RemoveContainer" containerID="f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5"
Feb 18 14:41:32 crc kubenswrapper[4817]: E0218 14:41:32.724670 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5\": container with ID starting with f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5 not found: ID does not exist" containerID="f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5"
Feb 18 14:41:32 crc kubenswrapper[4817]: I0218 14:41:32.724689 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5"} err="failed to get container status \"f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5\": rpc error: code = NotFound desc = could not find container \"f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5\": container with ID starting with f0058a5fcdc86badc1d41938e9a61987083f07bccde053adc3828ad27baa6ef5 not found: ID does not exist"
Feb 18 14:41:34 crc kubenswrapper[4817]: I0218 14:41:34.183423 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="284f9866-4eb6-4809-b1e0-3cbc545747ad" path="/var/lib/kubelet/pods/284f9866-4eb6-4809-b1e0-3cbc545747ad/volumes"
Feb 18 14:41:39 crc kubenswrapper[4817]: I0218 14:41:39.172478 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"
Feb 18 14:41:39 crc kubenswrapper[4817]: E0218 14:41:39.173439 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:41:54 crc kubenswrapper[4817]: I0218 14:41:54.173343 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"
Feb 18 14:41:54 crc kubenswrapper[4817]: E0218 14:41:54.175380 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:42:05 crc kubenswrapper[4817]: I0218 14:42:05.172475 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693"
Feb 18 14:42:05 crc kubenswrapper[4817]: E0218 14:42:05.173383 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:42:13 crc kubenswrapper[4817]: I0218 14:42:13.026136 4817 generic.go:334] "Generic (PLEG): container finished" podID="095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" containerID="f29c726cfcef758b70f1f0397ef61a9ae5acd71467988d8674c9244f67fd2dc4" exitCode=0
Feb 18 14:42:13 crc kubenswrapper[4817]: I0218 14:42:13.026353 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" event={"ID":"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0","Type":"ContainerDied","Data":"f29c726cfcef758b70f1f0397ef61a9ae5acd71467988d8674c9244f67fd2dc4"}
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.574499 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6"
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648119 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-2\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648197 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-ssh-key-openstack-edpm-ipam\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648235 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-0\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648521 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-inventory\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648570 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-3\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648595 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-1\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648637 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-combined-ca-bundle\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648667 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-extra-config-0\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648692 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l56b\" (UniqueName: \"kubernetes.io/projected/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-kube-api-access-2l56b\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648716 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-1\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.648760 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-0\") pod \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\" (UID: \"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0\") "
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.665227 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-kube-api-access-2l56b" (OuterVolumeSpecName: "kube-api-access-2l56b") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "kube-api-access-2l56b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.671853 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.676163 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.677429 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.679478 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.682384 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.684426 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.685355 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.688275 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.688452 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.712278 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-inventory" (OuterVolumeSpecName: "inventory") pod "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" (UID: "095f77dc-6f9e-4845-9cfe-6aeac65d3ab0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763367 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763406 4817 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763421 4817 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763432 4817 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763444 4817 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763456 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l56b\" (UniqueName: \"kubernetes.io/projected/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-kube-api-access-2l56b\") on node \"crc\" DevicePath \"\"" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763467 4817 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-1\") on node 
\"crc\" DevicePath \"\"" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763480 4817 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763491 4817 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763502 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:42:14 crc kubenswrapper[4817]: I0218 14:42:14.763514 4817 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/095f77dc-6f9e-4845-9cfe-6aeac65d3ab0-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.045365 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" event={"ID":"095f77dc-6f9e-4845-9cfe-6aeac65d3ab0","Type":"ContainerDied","Data":"0ab1fe699b0d7e9913331c7576a340509500335360bc6cc05fdc1f9f0024401f"} Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.045398 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jd5f6" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.045410 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ab1fe699b0d7e9913331c7576a340509500335360bc6cc05fdc1f9f0024401f" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.143891 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm"] Feb 18 14:42:15 crc kubenswrapper[4817]: E0218 14:42:15.144340 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerName="extract-content" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.144356 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerName="extract-content" Feb 18 14:42:15 crc kubenswrapper[4817]: E0218 14:42:15.144378 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerName="extract-utilities" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.144385 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerName="extract-utilities" Feb 18 14:42:15 crc kubenswrapper[4817]: E0218 14:42:15.144414 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.144421 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 18 14:42:15 crc kubenswrapper[4817]: E0218 14:42:15.144430 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerName="registry-server" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 
14:42:15.144437 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerName="registry-server" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.144616 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="095f77dc-6f9e-4845-9cfe-6aeac65d3ab0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.144627 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="284f9866-4eb6-4809-b1e0-3cbc545747ad" containerName="registry-server" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.145314 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.147450 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.147466 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.147475 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.147487 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.147961 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x8jkl" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.166117 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm"] Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.273941 4817 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.274606 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.274714 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.275070 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.275284 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-2d8m9\" (UniqueName: \"kubernetes.io/projected/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-kube-api-access-2d8m9\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.275373 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.275442 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.377716 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.377801 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.377862 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.377900 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d8m9\" (UniqueName: \"kubernetes.io/projected/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-kube-api-access-2d8m9\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.377995 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.378047 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc 
kubenswrapper[4817]: I0218 14:42:15.378095 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.382941 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.383125 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.383150 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.383621 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.383674 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.384682 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.397059 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d8m9\" (UniqueName: \"kubernetes.io/projected/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-kube-api-access-2d8m9\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qxktm\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:15 crc kubenswrapper[4817]: I0218 14:42:15.466470 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:42:16 crc kubenswrapper[4817]: I0218 14:42:16.107254 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm"] Feb 18 14:42:17 crc kubenswrapper[4817]: I0218 14:42:17.064770 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" event={"ID":"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab","Type":"ContainerStarted","Data":"96023eb91853dd90bfbc5e1289faf4367fb5c1b8373d1290f9320df52d1fa89e"} Feb 18 14:42:17 crc kubenswrapper[4817]: I0218 14:42:17.065089 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" event={"ID":"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab","Type":"ContainerStarted","Data":"732be39b7b69c2a346cbb1e94c22a5dd322f80291887f3eeec032a5f83a066ba"} Feb 18 14:42:17 crc kubenswrapper[4817]: I0218 14:42:17.086293 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" podStartSLOduration=1.604717719 podStartE2EDuration="2.086269802s" podCreationTimestamp="2026-02-18 14:42:15 +0000 UTC" firstStartedPulling="2026-02-18 14:42:16.107760306 +0000 UTC m=+2598.683296289" lastFinishedPulling="2026-02-18 14:42:16.589312389 +0000 UTC m=+2599.164848372" observedRunningTime="2026-02-18 14:42:17.077947842 +0000 UTC m=+2599.653483845" watchObservedRunningTime="2026-02-18 14:42:17.086269802 +0000 UTC m=+2599.661805795" Feb 18 14:42:18 crc kubenswrapper[4817]: I0218 14:42:18.177940 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:42:18 crc kubenswrapper[4817]: E0218 14:42:18.178538 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:42:31 crc kubenswrapper[4817]: I0218 14:42:31.172126 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:42:31 crc kubenswrapper[4817]: E0218 14:42:31.173137 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:42:45 crc kubenswrapper[4817]: I0218 14:42:45.171841 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:42:45 crc kubenswrapper[4817]: E0218 14:42:45.172638 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:42:57 crc kubenswrapper[4817]: I0218 14:42:57.172533 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:42:57 crc kubenswrapper[4817]: E0218 14:42:57.173136 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:43:09 crc kubenswrapper[4817]: I0218 14:43:09.171887 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:43:09 crc kubenswrapper[4817]: E0218 14:43:09.172947 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:43:23 crc kubenswrapper[4817]: I0218 14:43:23.171908 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:43:23 crc kubenswrapper[4817]: E0218 14:43:23.172834 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:43:36 crc kubenswrapper[4817]: I0218 14:43:36.172880 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:43:36 crc kubenswrapper[4817]: E0218 14:43:36.175498 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:43:50 crc kubenswrapper[4817]: I0218 14:43:50.172202 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:43:50 crc kubenswrapper[4817]: E0218 14:43:50.173057 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:44:03 crc kubenswrapper[4817]: I0218 14:44:03.235141 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:44:03 crc kubenswrapper[4817]: E0218 14:44:03.235963 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:44:18 crc kubenswrapper[4817]: I0218 14:44:18.177357 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:44:19 crc kubenswrapper[4817]: I0218 14:44:19.388638 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"5901a88315dae55cda41a0e2e61d35f9c231b8a69ea7d46b8ef93af1fee56d87"} Feb 18 14:44:26 crc kubenswrapper[4817]: I0218 14:44:26.452261 4817 generic.go:334] "Generic (PLEG): container finished" podID="5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" containerID="96023eb91853dd90bfbc5e1289faf4367fb5c1b8373d1290f9320df52d1fa89e" exitCode=0 Feb 18 14:44:26 crc kubenswrapper[4817]: I0218 14:44:26.452320 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" event={"ID":"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab","Type":"ContainerDied","Data":"96023eb91853dd90bfbc5e1289faf4367fb5c1b8373d1290f9320df52d1fa89e"} Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.064289 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.102828 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d8m9\" (UniqueName: \"kubernetes.io/projected/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-kube-api-access-2d8m9\") pod \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.103958 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-2\") pod \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.104128 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-1\") pod \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.104381 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-inventory\") pod \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.104552 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ssh-key-openstack-edpm-ipam\") pod \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.104845 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-telemetry-combined-ca-bundle\") pod \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.104968 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-0\") pod \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\" (UID: \"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab\") " Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.109578 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-kube-api-access-2d8m9" (OuterVolumeSpecName: "kube-api-access-2d8m9") pod "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" (UID: 
"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab"). InnerVolumeSpecName "kube-api-access-2d8m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.114593 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" (UID: "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.153539 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" (UID: "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.153563 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" (UID: "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.159157 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" (UID: "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab"). 
InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.166270 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" (UID: "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.166598 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-inventory" (OuterVolumeSpecName: "inventory") pod "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" (UID: "5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.207857 4817 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.207897 4817 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.207911 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d8m9\" (UniqueName: \"kubernetes.io/projected/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-kube-api-access-2d8m9\") on node \"crc\" DevicePath \"\"" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.207924 4817 reconciler_common.go:293] "Volume 
detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.207937 4817 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.207950 4817 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-inventory\") on node \"crc\" DevicePath \"\"" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.207963 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.472210 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" event={"ID":"5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab","Type":"ContainerDied","Data":"732be39b7b69c2a346cbb1e94c22a5dd322f80291887f3eeec032a5f83a066ba"} Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.472266 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="732be39b7b69c2a346cbb1e94c22a5dd322f80291887f3eeec032a5f83a066ba" Feb 18 14:44:28 crc kubenswrapper[4817]: I0218 14:44:28.472292 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qxktm" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.205870 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq"] Feb 18 14:45:00 crc kubenswrapper[4817]: E0218 14:45:00.222572 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.222596 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.222800 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.223646 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.228292 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.229605 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.232763 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq"] Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.350647 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-secret-volume\") pod \"collect-profiles-29523765-hjbpq\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.350877 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-config-volume\") pod \"collect-profiles-29523765-hjbpq\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.350950 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzqqm\" (UniqueName: \"kubernetes.io/projected/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-kube-api-access-tzqqm\") pod \"collect-profiles-29523765-hjbpq\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.452729 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-config-volume\") pod \"collect-profiles-29523765-hjbpq\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.452824 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzqqm\" (UniqueName: \"kubernetes.io/projected/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-kube-api-access-tzqqm\") pod \"collect-profiles-29523765-hjbpq\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.452967 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-secret-volume\") pod \"collect-profiles-29523765-hjbpq\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.454028 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-config-volume\") pod \"collect-profiles-29523765-hjbpq\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.467341 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-secret-volume\") pod \"collect-profiles-29523765-hjbpq\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.473132 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzqqm\" (UniqueName: \"kubernetes.io/projected/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-kube-api-access-tzqqm\") pod \"collect-profiles-29523765-hjbpq\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:00 crc kubenswrapper[4817]: I0218 14:45:00.562160 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:01 crc kubenswrapper[4817]: I0218 14:45:01.143943 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq"] Feb 18 14:45:01 crc kubenswrapper[4817]: I0218 14:45:01.796449 4817 generic.go:334] "Generic (PLEG): container finished" podID="7cb5930f-4bff-4d3b-aa87-1fa0c403d342" containerID="de9407e60ced8c69134b232d728c93bf4d9328f9b5557fd38070be0e7b056e1f" exitCode=0 Feb 18 14:45:01 crc kubenswrapper[4817]: I0218 14:45:01.797617 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" event={"ID":"7cb5930f-4bff-4d3b-aa87-1fa0c403d342","Type":"ContainerDied","Data":"de9407e60ced8c69134b232d728c93bf4d9328f9b5557fd38070be0e7b056e1f"} Feb 18 14:45:01 crc kubenswrapper[4817]: I0218 14:45:01.798067 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" 
event={"ID":"7cb5930f-4bff-4d3b-aa87-1fa0c403d342","Type":"ContainerStarted","Data":"d5d16d0bd72c7e74ca9f0a72f456cd70e63a76151444d57f72c0870fcd151542"} Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.243070 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.424349 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-config-volume\") pod \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.424426 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-secret-volume\") pod \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.424459 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzqqm\" (UniqueName: \"kubernetes.io/projected/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-kube-api-access-tzqqm\") pod \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\" (UID: \"7cb5930f-4bff-4d3b-aa87-1fa0c403d342\") " Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.426035 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-config-volume" (OuterVolumeSpecName: "config-volume") pod "7cb5930f-4bff-4d3b-aa87-1fa0c403d342" (UID: "7cb5930f-4bff-4d3b-aa87-1fa0c403d342"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.431569 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7cb5930f-4bff-4d3b-aa87-1fa0c403d342" (UID: "7cb5930f-4bff-4d3b-aa87-1fa0c403d342"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.432267 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-kube-api-access-tzqqm" (OuterVolumeSpecName: "kube-api-access-tzqqm") pod "7cb5930f-4bff-4d3b-aa87-1fa0c403d342" (UID: "7cb5930f-4bff-4d3b-aa87-1fa0c403d342"). InnerVolumeSpecName "kube-api-access-tzqqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.527506 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.527537 4817 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.527547 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzqqm\" (UniqueName: \"kubernetes.io/projected/7cb5930f-4bff-4d3b-aa87-1fa0c403d342-kube-api-access-tzqqm\") on node \"crc\" DevicePath \"\"" Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.821045 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" 
event={"ID":"7cb5930f-4bff-4d3b-aa87-1fa0c403d342","Type":"ContainerDied","Data":"d5d16d0bd72c7e74ca9f0a72f456cd70e63a76151444d57f72c0870fcd151542"} Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.821086 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5d16d0bd72c7e74ca9f0a72f456cd70e63a76151444d57f72c0870fcd151542" Feb 18 14:45:03 crc kubenswrapper[4817]: I0218 14:45:03.821109 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq" Feb 18 14:45:04 crc kubenswrapper[4817]: I0218 14:45:04.314202 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"] Feb 18 14:45:04 crc kubenswrapper[4817]: I0218 14:45:04.322591 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523720-6wkzh"] Feb 18 14:45:06 crc kubenswrapper[4817]: I0218 14:45:06.201952 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a49a67-343c-4b86-87b7-68804e001fb2" path="/var/lib/kubelet/pods/22a49a67-343c-4b86-87b7-68804e001fb2/volumes" Feb 18 14:45:33 crc kubenswrapper[4817]: I0218 14:45:33.024683 4817 scope.go:117] "RemoveContainer" containerID="4ba66eb21295d93203ca6f817ddb9a3cf504e5521deb2f3fc1a0b9db0a0b6954" Feb 18 14:46:42 crc kubenswrapper[4817]: I0218 14:46:42.863817 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:46:42 crc kubenswrapper[4817]: I0218 14:46:42.864282 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:47:12 crc kubenswrapper[4817]: I0218 14:47:12.863276 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:47:12 crc kubenswrapper[4817]: I0218 14:47:12.863874 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:47:42 crc kubenswrapper[4817]: I0218 14:47:42.863714 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:47:42 crc kubenswrapper[4817]: I0218 14:47:42.864307 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:47:42 crc kubenswrapper[4817]: I0218 14:47:42.864359 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 14:47:42 crc kubenswrapper[4817]: I0218 14:47:42.865222 4817 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5901a88315dae55cda41a0e2e61d35f9c231b8a69ea7d46b8ef93af1fee56d87"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 14:47:42 crc kubenswrapper[4817]: I0218 14:47:42.865293 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://5901a88315dae55cda41a0e2e61d35f9c231b8a69ea7d46b8ef93af1fee56d87" gracePeriod=600 Feb 18 14:47:43 crc kubenswrapper[4817]: I0218 14:47:43.363661 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="5901a88315dae55cda41a0e2e61d35f9c231b8a69ea7d46b8ef93af1fee56d87" exitCode=0 Feb 18 14:47:43 crc kubenswrapper[4817]: I0218 14:47:43.363719 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"5901a88315dae55cda41a0e2e61d35f9c231b8a69ea7d46b8ef93af1fee56d87"} Feb 18 14:47:43 crc kubenswrapper[4817]: I0218 14:47:43.364101 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866"} Feb 18 14:47:43 crc kubenswrapper[4817]: I0218 14:47:43.364129 4817 scope.go:117] "RemoveContainer" containerID="e371808d6223e38f501e7cfedb8ba2c785e190ea4bdf5f6fac2fa1dfa7ee7693" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.770240 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sw745"] Feb 18 14:49:12 crc 
kubenswrapper[4817]: E0218 14:49:12.771344 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb5930f-4bff-4d3b-aa87-1fa0c403d342" containerName="collect-profiles" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.771365 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb5930f-4bff-4d3b-aa87-1fa0c403d342" containerName="collect-profiles" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.771633 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb5930f-4bff-4d3b-aa87-1fa0c403d342" containerName="collect-profiles" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.773704 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.785997 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sw745"] Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.812477 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-catalog-content\") pod \"community-operators-sw745\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.812631 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-utilities\") pod \"community-operators-sw745\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.812695 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mh2h\" (UniqueName: 
\"kubernetes.io/projected/6b900265-a2c8-40fa-8855-794813c7f65c-kube-api-access-4mh2h\") pod \"community-operators-sw745\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.914507 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-utilities\") pod \"community-operators-sw745\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.914671 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mh2h\" (UniqueName: \"kubernetes.io/projected/6b900265-a2c8-40fa-8855-794813c7f65c-kube-api-access-4mh2h\") pod \"community-operators-sw745\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.914794 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-catalog-content\") pod \"community-operators-sw745\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.915421 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-utilities\") pod \"community-operators-sw745\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.915460 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-catalog-content\") pod \"community-operators-sw745\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:12 crc kubenswrapper[4817]: I0218 14:49:12.935524 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mh2h\" (UniqueName: \"kubernetes.io/projected/6b900265-a2c8-40fa-8855-794813c7f65c-kube-api-access-4mh2h\") pod \"community-operators-sw745\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:13 crc kubenswrapper[4817]: I0218 14:49:13.100683 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:13 crc kubenswrapper[4817]: I0218 14:49:13.627649 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sw745"] Feb 18 14:49:14 crc kubenswrapper[4817]: I0218 14:49:14.611372 4817 generic.go:334] "Generic (PLEG): container finished" podID="6b900265-a2c8-40fa-8855-794813c7f65c" containerID="aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0" exitCode=0 Feb 18 14:49:14 crc kubenswrapper[4817]: I0218 14:49:14.611504 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sw745" event={"ID":"6b900265-a2c8-40fa-8855-794813c7f65c","Type":"ContainerDied","Data":"aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0"} Feb 18 14:49:14 crc kubenswrapper[4817]: I0218 14:49:14.611746 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sw745" event={"ID":"6b900265-a2c8-40fa-8855-794813c7f65c","Type":"ContainerStarted","Data":"9801ee30c7a15c27f7bde81d1ccb095adf3a315701b3c1cdc097cbf7f5f4bcb0"} Feb 18 14:49:14 crc kubenswrapper[4817]: I0218 14:49:14.615225 4817 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Feb 18 14:49:15 crc kubenswrapper[4817]: I0218 14:49:15.627851 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sw745" event={"ID":"6b900265-a2c8-40fa-8855-794813c7f65c","Type":"ContainerStarted","Data":"54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224"} Feb 18 14:49:17 crc kubenswrapper[4817]: I0218 14:49:17.645638 4817 generic.go:334] "Generic (PLEG): container finished" podID="6b900265-a2c8-40fa-8855-794813c7f65c" containerID="54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224" exitCode=0 Feb 18 14:49:17 crc kubenswrapper[4817]: I0218 14:49:17.645725 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sw745" event={"ID":"6b900265-a2c8-40fa-8855-794813c7f65c","Type":"ContainerDied","Data":"54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224"} Feb 18 14:49:18 crc kubenswrapper[4817]: I0218 14:49:18.659786 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sw745" event={"ID":"6b900265-a2c8-40fa-8855-794813c7f65c","Type":"ContainerStarted","Data":"9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db"} Feb 18 14:49:18 crc kubenswrapper[4817]: I0218 14:49:18.683416 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sw745" podStartSLOduration=3.160378315 podStartE2EDuration="6.683395377s" podCreationTimestamp="2026-02-18 14:49:12 +0000 UTC" firstStartedPulling="2026-02-18 14:49:14.614883232 +0000 UTC m=+3017.190419225" lastFinishedPulling="2026-02-18 14:49:18.137900304 +0000 UTC m=+3020.713436287" observedRunningTime="2026-02-18 14:49:18.681728424 +0000 UTC m=+3021.257264407" watchObservedRunningTime="2026-02-18 14:49:18.683395377 +0000 UTC m=+3021.258931360" Feb 18 14:49:23 crc kubenswrapper[4817]: I0218 14:49:23.101338 4817 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:23 crc kubenswrapper[4817]: I0218 14:49:23.101780 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:23 crc kubenswrapper[4817]: I0218 14:49:23.173857 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:23 crc kubenswrapper[4817]: I0218 14:49:23.758914 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:23 crc kubenswrapper[4817]: I0218 14:49:23.818259 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sw745"] Feb 18 14:49:25 crc kubenswrapper[4817]: I0218 14:49:25.732788 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sw745" podUID="6b900265-a2c8-40fa-8855-794813c7f65c" containerName="registry-server" containerID="cri-o://9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db" gracePeriod=2 Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.275490 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.410042 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mh2h\" (UniqueName: \"kubernetes.io/projected/6b900265-a2c8-40fa-8855-794813c7f65c-kube-api-access-4mh2h\") pod \"6b900265-a2c8-40fa-8855-794813c7f65c\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.410382 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-catalog-content\") pod \"6b900265-a2c8-40fa-8855-794813c7f65c\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.410636 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-utilities\") pod \"6b900265-a2c8-40fa-8855-794813c7f65c\" (UID: \"6b900265-a2c8-40fa-8855-794813c7f65c\") " Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.411633 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-utilities" (OuterVolumeSpecName: "utilities") pod "6b900265-a2c8-40fa-8855-794813c7f65c" (UID: "6b900265-a2c8-40fa-8855-794813c7f65c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.419827 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b900265-a2c8-40fa-8855-794813c7f65c-kube-api-access-4mh2h" (OuterVolumeSpecName: "kube-api-access-4mh2h") pod "6b900265-a2c8-40fa-8855-794813c7f65c" (UID: "6b900265-a2c8-40fa-8855-794813c7f65c"). InnerVolumeSpecName "kube-api-access-4mh2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.513736 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.513782 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mh2h\" (UniqueName: \"kubernetes.io/projected/6b900265-a2c8-40fa-8855-794813c7f65c-kube-api-access-4mh2h\") on node \"crc\" DevicePath \"\"" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.748159 4817 generic.go:334] "Generic (PLEG): container finished" podID="6b900265-a2c8-40fa-8855-794813c7f65c" containerID="9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db" exitCode=0 Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.748272 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sw745" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.748275 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sw745" event={"ID":"6b900265-a2c8-40fa-8855-794813c7f65c","Type":"ContainerDied","Data":"9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db"} Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.748347 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sw745" event={"ID":"6b900265-a2c8-40fa-8855-794813c7f65c","Type":"ContainerDied","Data":"9801ee30c7a15c27f7bde81d1ccb095adf3a315701b3c1cdc097cbf7f5f4bcb0"} Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.748373 4817 scope.go:117] "RemoveContainer" containerID="9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.781266 4817 scope.go:117] "RemoveContainer" 
containerID="54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.813488 4817 scope.go:117] "RemoveContainer" containerID="aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.845559 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b900265-a2c8-40fa-8855-794813c7f65c" (UID: "6b900265-a2c8-40fa-8855-794813c7f65c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.853899 4817 scope.go:117] "RemoveContainer" containerID="9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db" Feb 18 14:49:26 crc kubenswrapper[4817]: E0218 14:49:26.854291 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db\": container with ID starting with 9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db not found: ID does not exist" containerID="9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.854319 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db"} err="failed to get container status \"9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db\": rpc error: code = NotFound desc = could not find container \"9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db\": container with ID starting with 9405ebb32dc802aa62c24bc785dee17917731d9271831dbd9917297ef0cd72db not found: ID does not exist" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 
14:49:26.854353 4817 scope.go:117] "RemoveContainer" containerID="54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224" Feb 18 14:49:26 crc kubenswrapper[4817]: E0218 14:49:26.854669 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224\": container with ID starting with 54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224 not found: ID does not exist" containerID="54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.854701 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224"} err="failed to get container status \"54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224\": rpc error: code = NotFound desc = could not find container \"54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224\": container with ID starting with 54e2b91fd9f7cb47f31b917014d18c0dc1860908ec293adfe4b63cfd0ebc7224 not found: ID does not exist" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.854727 4817 scope.go:117] "RemoveContainer" containerID="aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0" Feb 18 14:49:26 crc kubenswrapper[4817]: E0218 14:49:26.855045 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0\": container with ID starting with aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0 not found: ID does not exist" containerID="aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.855107 4817 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0"} err="failed to get container status \"aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0\": rpc error: code = NotFound desc = could not find container \"aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0\": container with ID starting with aa4a4a7805f6d900b4b479e9bea79016172f6dfb906620d090817329a42232c0 not found: ID does not exist" Feb 18 14:49:26 crc kubenswrapper[4817]: I0218 14:49:26.920294 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b900265-a2c8-40fa-8855-794813c7f65c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 14:49:27 crc kubenswrapper[4817]: I0218 14:49:27.091423 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sw745"] Feb 18 14:49:27 crc kubenswrapper[4817]: I0218 14:49:27.099439 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sw745"] Feb 18 14:49:28 crc kubenswrapper[4817]: I0218 14:49:28.185733 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b900265-a2c8-40fa-8855-794813c7f65c" path="/var/lib/kubelet/pods/6b900265-a2c8-40fa-8855-794813c7f65c/volumes" Feb 18 14:50:03 crc kubenswrapper[4817]: E0218 14:50:03.254585 4817 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.084s" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.383854 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wfhs8"] Feb 18 14:50:12 crc kubenswrapper[4817]: E0218 14:50:12.385538 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b900265-a2c8-40fa-8855-794813c7f65c" containerName="extract-utilities" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.385555 4817 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6b900265-a2c8-40fa-8855-794813c7f65c" containerName="extract-utilities" Feb 18 14:50:12 crc kubenswrapper[4817]: E0218 14:50:12.385588 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b900265-a2c8-40fa-8855-794813c7f65c" containerName="registry-server" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.385596 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b900265-a2c8-40fa-8855-794813c7f65c" containerName="registry-server" Feb 18 14:50:12 crc kubenswrapper[4817]: E0218 14:50:12.385625 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b900265-a2c8-40fa-8855-794813c7f65c" containerName="extract-content" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.385632 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b900265-a2c8-40fa-8855-794813c7f65c" containerName="extract-content" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.385894 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b900265-a2c8-40fa-8855-794813c7f65c" containerName="registry-server" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.388115 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.405910 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wfhs8"] Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.540387 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-utilities\") pod \"redhat-marketplace-wfhs8\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") " pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.540557 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-catalog-content\") pod \"redhat-marketplace-wfhs8\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") " pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.540643 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr5xs\" (UniqueName: \"kubernetes.io/projected/395eb3a9-9a33-4911-8b88-8138c6b01997-kube-api-access-sr5xs\") pod \"redhat-marketplace-wfhs8\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") " pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.643130 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr5xs\" (UniqueName: \"kubernetes.io/projected/395eb3a9-9a33-4911-8b88-8138c6b01997-kube-api-access-sr5xs\") pod \"redhat-marketplace-wfhs8\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") " pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.643298 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-utilities\") pod \"redhat-marketplace-wfhs8\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") " pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.643400 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-catalog-content\") pod \"redhat-marketplace-wfhs8\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") " pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.643900 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-utilities\") pod \"redhat-marketplace-wfhs8\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") " pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.644119 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-catalog-content\") pod \"redhat-marketplace-wfhs8\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") " pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.666396 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr5xs\" (UniqueName: \"kubernetes.io/projected/395eb3a9-9a33-4911-8b88-8138c6b01997-kube-api-access-sr5xs\") pod \"redhat-marketplace-wfhs8\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") " pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.717262 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.863711 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:50:12 crc kubenswrapper[4817]: I0218 14:50:12.864028 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:50:13 crc kubenswrapper[4817]: I0218 14:50:13.231917 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wfhs8"] Feb 18 14:50:13 crc kubenswrapper[4817]: I0218 14:50:13.386912 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wfhs8" event={"ID":"395eb3a9-9a33-4911-8b88-8138c6b01997","Type":"ContainerStarted","Data":"03fb72412ff56008c78eb80e293b4255fefa3c5edf3ed4d5cf5270ff26e60a61"} Feb 18 14:50:14 crc kubenswrapper[4817]: I0218 14:50:14.405134 4817 generic.go:334] "Generic (PLEG): container finished" podID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerID="592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee" exitCode=0 Feb 18 14:50:14 crc kubenswrapper[4817]: I0218 14:50:14.405231 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wfhs8" event={"ID":"395eb3a9-9a33-4911-8b88-8138c6b01997","Type":"ContainerDied","Data":"592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee"} Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.346779 4817 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-jtzwm"] Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.349507 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.370250 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtzwm"] Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.410158 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-utilities\") pod \"redhat-operators-jtzwm\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") " pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.410424 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd2l6\" (UniqueName: \"kubernetes.io/projected/a2734b9c-4b1c-44bd-9708-f5e9541dc977-kube-api-access-kd2l6\") pod \"redhat-operators-jtzwm\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") " pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.410472 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-catalog-content\") pod \"redhat-operators-jtzwm\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") " pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.418840 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wfhs8" event={"ID":"395eb3a9-9a33-4911-8b88-8138c6b01997","Type":"ContainerStarted","Data":"5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173"} Feb 18 14:50:15 crc 
kubenswrapper[4817]: I0218 14:50:15.512172 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-utilities\") pod \"redhat-operators-jtzwm\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") " pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.512313 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2l6\" (UniqueName: \"kubernetes.io/projected/a2734b9c-4b1c-44bd-9708-f5e9541dc977-kube-api-access-kd2l6\") pod \"redhat-operators-jtzwm\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") " pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.512345 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-catalog-content\") pod \"redhat-operators-jtzwm\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") " pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.512964 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-catalog-content\") pod \"redhat-operators-jtzwm\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") " pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.513026 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-utilities\") pod \"redhat-operators-jtzwm\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") " pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.532673 4817 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd2l6\" (UniqueName: \"kubernetes.io/projected/a2734b9c-4b1c-44bd-9708-f5e9541dc977-kube-api-access-kd2l6\") pod \"redhat-operators-jtzwm\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") " pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:15 crc kubenswrapper[4817]: I0218 14:50:15.670598 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:16 crc kubenswrapper[4817]: I0218 14:50:16.187141 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtzwm"] Feb 18 14:50:16 crc kubenswrapper[4817]: W0218 14:50:16.190509 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2734b9c_4b1c_44bd_9708_f5e9541dc977.slice/crio-0f6b8c24354012e34c0c553d9036a44cc11dafd2ac544ade9ba9aecf189ce374 WatchSource:0}: Error finding container 0f6b8c24354012e34c0c553d9036a44cc11dafd2ac544ade9ba9aecf189ce374: Status 404 returned error can't find the container with id 0f6b8c24354012e34c0c553d9036a44cc11dafd2ac544ade9ba9aecf189ce374 Feb 18 14:50:16 crc kubenswrapper[4817]: I0218 14:50:16.431381 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtzwm" event={"ID":"a2734b9c-4b1c-44bd-9708-f5e9541dc977","Type":"ContainerStarted","Data":"dfa7f2189f6887de0ff70d4eaf8467bc0cdebcb3dec4260a7a0a8ac543dccbd5"} Feb 18 14:50:16 crc kubenswrapper[4817]: I0218 14:50:16.431807 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtzwm" event={"ID":"a2734b9c-4b1c-44bd-9708-f5e9541dc977","Type":"ContainerStarted","Data":"0f6b8c24354012e34c0c553d9036a44cc11dafd2ac544ade9ba9aecf189ce374"} Feb 18 14:50:16 crc kubenswrapper[4817]: I0218 14:50:16.435180 4817 generic.go:334] "Generic (PLEG): container finished" 
podID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerID="5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173" exitCode=0 Feb 18 14:50:16 crc kubenswrapper[4817]: I0218 14:50:16.435224 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wfhs8" event={"ID":"395eb3a9-9a33-4911-8b88-8138c6b01997","Type":"ContainerDied","Data":"5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173"} Feb 18 14:50:17 crc kubenswrapper[4817]: I0218 14:50:17.448099 4817 generic.go:334] "Generic (PLEG): container finished" podID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerID="dfa7f2189f6887de0ff70d4eaf8467bc0cdebcb3dec4260a7a0a8ac543dccbd5" exitCode=0 Feb 18 14:50:17 crc kubenswrapper[4817]: I0218 14:50:17.448162 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtzwm" event={"ID":"a2734b9c-4b1c-44bd-9708-f5e9541dc977","Type":"ContainerDied","Data":"dfa7f2189f6887de0ff70d4eaf8467bc0cdebcb3dec4260a7a0a8ac543dccbd5"} Feb 18 14:50:17 crc kubenswrapper[4817]: I0218 14:50:17.451882 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wfhs8" event={"ID":"395eb3a9-9a33-4911-8b88-8138c6b01997","Type":"ContainerStarted","Data":"d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb"} Feb 18 14:50:17 crc kubenswrapper[4817]: I0218 14:50:17.484417 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wfhs8" podStartSLOduration=2.825438199 podStartE2EDuration="5.484388777s" podCreationTimestamp="2026-02-18 14:50:12 +0000 UTC" firstStartedPulling="2026-02-18 14:50:14.409931466 +0000 UTC m=+3076.985467449" lastFinishedPulling="2026-02-18 14:50:17.068882014 +0000 UTC m=+3079.644418027" observedRunningTime="2026-02-18 14:50:17.481301539 +0000 UTC m=+3080.056837582" watchObservedRunningTime="2026-02-18 14:50:17.484388777 +0000 UTC m=+3080.059924760" Feb 18 
14:50:18 crc kubenswrapper[4817]: I0218 14:50:18.462502 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtzwm" event={"ID":"a2734b9c-4b1c-44bd-9708-f5e9541dc977","Type":"ContainerStarted","Data":"6c917a188d61a63b8918e7726396f86a3c3b5eb12bdca4c5c498fe20b2c34a41"} Feb 18 14:50:20 crc kubenswrapper[4817]: I0218 14:50:20.484201 4817 generic.go:334] "Generic (PLEG): container finished" podID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerID="6c917a188d61a63b8918e7726396f86a3c3b5eb12bdca4c5c498fe20b2c34a41" exitCode=0 Feb 18 14:50:20 crc kubenswrapper[4817]: I0218 14:50:20.484313 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtzwm" event={"ID":"a2734b9c-4b1c-44bd-9708-f5e9541dc977","Type":"ContainerDied","Data":"6c917a188d61a63b8918e7726396f86a3c3b5eb12bdca4c5c498fe20b2c34a41"} Feb 18 14:50:22 crc kubenswrapper[4817]: I0218 14:50:22.507788 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtzwm" event={"ID":"a2734b9c-4b1c-44bd-9708-f5e9541dc977","Type":"ContainerStarted","Data":"857de57e86011e0ca06d597cbee593c62b13cb5a23d86d78a657c6528c11b2c3"} Feb 18 14:50:22 crc kubenswrapper[4817]: I0218 14:50:22.537772 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtzwm" podStartSLOduration=4.044610734 podStartE2EDuration="7.537747548s" podCreationTimestamp="2026-02-18 14:50:15 +0000 UTC" firstStartedPulling="2026-02-18 14:50:17.450280454 +0000 UTC m=+3080.025816437" lastFinishedPulling="2026-02-18 14:50:20.943417268 +0000 UTC m=+3083.518953251" observedRunningTime="2026-02-18 14:50:22.528467623 +0000 UTC m=+3085.104003606" watchObservedRunningTime="2026-02-18 14:50:22.537747548 +0000 UTC m=+3085.113283531" Feb 18 14:50:22 crc kubenswrapper[4817]: I0218 14:50:22.718126 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:22 crc kubenswrapper[4817]: I0218 14:50:22.718187 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:22 crc kubenswrapper[4817]: I0218 14:50:22.771862 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:23 crc kubenswrapper[4817]: I0218 14:50:23.575548 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wfhs8" Feb 18 14:50:25 crc kubenswrapper[4817]: I0218 14:50:25.671784 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:25 crc kubenswrapper[4817]: I0218 14:50:25.671849 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jtzwm" Feb 18 14:50:26 crc kubenswrapper[4817]: I0218 14:50:26.721735 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jtzwm" podUID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerName="registry-server" probeResult="failure" output=< Feb 18 14:50:26 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 14:50:26 crc kubenswrapper[4817]: > Feb 18 14:50:27 crc kubenswrapper[4817]: I0218 14:50:27.535358 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wfhs8"] Feb 18 14:50:27 crc kubenswrapper[4817]: I0218 14:50:27.535641 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wfhs8" podUID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerName="registry-server" containerID="cri-o://d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb" gracePeriod=2 Feb 18 14:50:28 crc kubenswrapper[4817]: 
I0218 14:50:28.131079 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wfhs8"
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.193172 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-catalog-content\") pod \"395eb3a9-9a33-4911-8b88-8138c6b01997\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") "
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.193345 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-utilities\") pod \"395eb3a9-9a33-4911-8b88-8138c6b01997\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") "
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.193449 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr5xs\" (UniqueName: \"kubernetes.io/projected/395eb3a9-9a33-4911-8b88-8138c6b01997-kube-api-access-sr5xs\") pod \"395eb3a9-9a33-4911-8b88-8138c6b01997\" (UID: \"395eb3a9-9a33-4911-8b88-8138c6b01997\") "
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.193968 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-utilities" (OuterVolumeSpecName: "utilities") pod "395eb3a9-9a33-4911-8b88-8138c6b01997" (UID: "395eb3a9-9a33-4911-8b88-8138c6b01997"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.194145 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.200647 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395eb3a9-9a33-4911-8b88-8138c6b01997-kube-api-access-sr5xs" (OuterVolumeSpecName: "kube-api-access-sr5xs") pod "395eb3a9-9a33-4911-8b88-8138c6b01997" (UID: "395eb3a9-9a33-4911-8b88-8138c6b01997"). InnerVolumeSpecName "kube-api-access-sr5xs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.216607 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "395eb3a9-9a33-4911-8b88-8138c6b01997" (UID: "395eb3a9-9a33-4911-8b88-8138c6b01997"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.295756 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395eb3a9-9a33-4911-8b88-8138c6b01997-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.296078 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr5xs\" (UniqueName: \"kubernetes.io/projected/395eb3a9-9a33-4911-8b88-8138c6b01997-kube-api-access-sr5xs\") on node \"crc\" DevicePath \"\""
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.583855 4817 generic.go:334] "Generic (PLEG): container finished" podID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerID="d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb" exitCode=0
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.583899 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wfhs8" event={"ID":"395eb3a9-9a33-4911-8b88-8138c6b01997","Type":"ContainerDied","Data":"d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb"}
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.583934 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wfhs8" event={"ID":"395eb3a9-9a33-4911-8b88-8138c6b01997","Type":"ContainerDied","Data":"03fb72412ff56008c78eb80e293b4255fefa3c5edf3ed4d5cf5270ff26e60a61"}
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.583955 4817 scope.go:117] "RemoveContainer" containerID="d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb"
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.584065 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wfhs8"
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.608003 4817 scope.go:117] "RemoveContainer" containerID="5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173"
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.620131 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wfhs8"]
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.631081 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wfhs8"]
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.657930 4817 scope.go:117] "RemoveContainer" containerID="592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee"
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.688356 4817 scope.go:117] "RemoveContainer" containerID="d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb"
Feb 18 14:50:28 crc kubenswrapper[4817]: E0218 14:50:28.688787 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb\": container with ID starting with d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb not found: ID does not exist" containerID="d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb"
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.688834 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb"} err="failed to get container status \"d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb\": rpc error: code = NotFound desc = could not find container \"d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb\": container with ID starting with d3dd2d73c113dda6bf853bfbd2d2471624b3c9bdc5355d7bf100869374790bdb not found: ID does not exist"
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.688861 4817 scope.go:117] "RemoveContainer" containerID="5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173"
Feb 18 14:50:28 crc kubenswrapper[4817]: E0218 14:50:28.689382 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173\": container with ID starting with 5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173 not found: ID does not exist" containerID="5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173"
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.689410 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173"} err="failed to get container status \"5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173\": rpc error: code = NotFound desc = could not find container \"5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173\": container with ID starting with 5070ebc38f6c147a3a1da0c1a8c53acde7b2acee86d29993e28c3c3b80ea5173 not found: ID does not exist"
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.689424 4817 scope.go:117] "RemoveContainer" containerID="592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee"
Feb 18 14:50:28 crc kubenswrapper[4817]: E0218 14:50:28.689717 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee\": container with ID starting with 592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee not found: ID does not exist" containerID="592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee"
Feb 18 14:50:28 crc kubenswrapper[4817]: I0218 14:50:28.689763 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee"} err="failed to get container status \"592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee\": rpc error: code = NotFound desc = could not find container \"592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee\": container with ID starting with 592341ac5679cb55587bb825565ec2547ce9234780d7248a724ac97f88b89dee not found: ID does not exist"
Feb 18 14:50:30 crc kubenswrapper[4817]: I0218 14:50:30.196086 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395eb3a9-9a33-4911-8b88-8138c6b01997" path="/var/lib/kubelet/pods/395eb3a9-9a33-4911-8b88-8138c6b01997/volumes"
Feb 18 14:50:35 crc kubenswrapper[4817]: I0218 14:50:35.725229 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtzwm"
Feb 18 14:50:35 crc kubenswrapper[4817]: I0218 14:50:35.778394 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jtzwm"
Feb 18 14:50:39 crc kubenswrapper[4817]: I0218 14:50:39.338411 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtzwm"]
Feb 18 14:50:39 crc kubenswrapper[4817]: I0218 14:50:39.339192 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtzwm" podUID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerName="registry-server" containerID="cri-o://857de57e86011e0ca06d597cbee593c62b13cb5a23d86d78a657c6528c11b2c3" gracePeriod=2
Feb 18 14:50:39 crc kubenswrapper[4817]: I0218 14:50:39.688257 4817 generic.go:334] "Generic (PLEG): container finished" podID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerID="857de57e86011e0ca06d597cbee593c62b13cb5a23d86d78a657c6528c11b2c3" exitCode=0
Feb 18 14:50:39 crc kubenswrapper[4817]: I0218 14:50:39.688341 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtzwm" event={"ID":"a2734b9c-4b1c-44bd-9708-f5e9541dc977","Type":"ContainerDied","Data":"857de57e86011e0ca06d597cbee593c62b13cb5a23d86d78a657c6528c11b2c3"}
Feb 18 14:50:39 crc kubenswrapper[4817]: I0218 14:50:39.871624 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtzwm"
Feb 18 14:50:39 crc kubenswrapper[4817]: I0218 14:50:39.937614 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd2l6\" (UniqueName: \"kubernetes.io/projected/a2734b9c-4b1c-44bd-9708-f5e9541dc977-kube-api-access-kd2l6\") pod \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") "
Feb 18 14:50:39 crc kubenswrapper[4817]: I0218 14:50:39.937688 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-utilities\") pod \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") "
Feb 18 14:50:39 crc kubenswrapper[4817]: I0218 14:50:39.937871 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-catalog-content\") pod \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\" (UID: \"a2734b9c-4b1c-44bd-9708-f5e9541dc977\") "
Feb 18 14:50:39 crc kubenswrapper[4817]: I0218 14:50:39.939122 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-utilities" (OuterVolumeSpecName: "utilities") pod "a2734b9c-4b1c-44bd-9708-f5e9541dc977" (UID: "a2734b9c-4b1c-44bd-9708-f5e9541dc977"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:50:39 crc kubenswrapper[4817]: I0218 14:50:39.955206 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2734b9c-4b1c-44bd-9708-f5e9541dc977-kube-api-access-kd2l6" (OuterVolumeSpecName: "kube-api-access-kd2l6") pod "a2734b9c-4b1c-44bd-9708-f5e9541dc977" (UID: "a2734b9c-4b1c-44bd-9708-f5e9541dc977"). InnerVolumeSpecName "kube-api-access-kd2l6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.040589 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.040625 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd2l6\" (UniqueName: \"kubernetes.io/projected/a2734b9c-4b1c-44bd-9708-f5e9541dc977-kube-api-access-kd2l6\") on node \"crc\" DevicePath \"\""
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.085228 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2734b9c-4b1c-44bd-9708-f5e9541dc977" (UID: "a2734b9c-4b1c-44bd-9708-f5e9541dc977"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.142632 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2734b9c-4b1c-44bd-9708-f5e9541dc977-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.699582 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtzwm" event={"ID":"a2734b9c-4b1c-44bd-9708-f5e9541dc977","Type":"ContainerDied","Data":"0f6b8c24354012e34c0c553d9036a44cc11dafd2ac544ade9ba9aecf189ce374"}
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.700020 4817 scope.go:117] "RemoveContainer" containerID="857de57e86011e0ca06d597cbee593c62b13cb5a23d86d78a657c6528c11b2c3"
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.699653 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtzwm"
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.721855 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtzwm"]
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.726487 4817 scope.go:117] "RemoveContainer" containerID="6c917a188d61a63b8918e7726396f86a3c3b5eb12bdca4c5c498fe20b2c34a41"
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.730713 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtzwm"]
Feb 18 14:50:40 crc kubenswrapper[4817]: I0218 14:50:40.749695 4817 scope.go:117] "RemoveContainer" containerID="dfa7f2189f6887de0ff70d4eaf8467bc0cdebcb3dec4260a7a0a8ac543dccbd5"
Feb 18 14:50:42 crc kubenswrapper[4817]: I0218 14:50:42.185543 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" path="/var/lib/kubelet/pods/a2734b9c-4b1c-44bd-9708-f5e9541dc977/volumes"
Feb 18 14:50:42 crc kubenswrapper[4817]: I0218 14:50:42.863896 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:50:42 crc kubenswrapper[4817]: I0218 14:50:42.864246 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:51:12 crc kubenswrapper[4817]: I0218 14:51:12.864066 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 14:51:12 crc kubenswrapper[4817]: I0218 14:51:12.864724 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 14:51:12 crc kubenswrapper[4817]: I0218 14:51:12.864783 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb"
Feb 18 14:51:12 crc kubenswrapper[4817]: I0218 14:51:12.865741 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 14:51:12 crc kubenswrapper[4817]: I0218 14:51:12.865808 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" gracePeriod=600
Feb 18 14:51:12 crc kubenswrapper[4817]: E0218 14:51:12.985815 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:51:13 crc kubenswrapper[4817]: I0218 14:51:13.120756 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" exitCode=0
Feb 18 14:51:13 crc kubenswrapper[4817]: I0218 14:51:13.120844 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866"}
Feb 18 14:51:13 crc kubenswrapper[4817]: I0218 14:51:13.121249 4817 scope.go:117] "RemoveContainer" containerID="5901a88315dae55cda41a0e2e61d35f9c231b8a69ea7d46b8ef93af1fee56d87"
Feb 18 14:51:13 crc kubenswrapper[4817]: I0218 14:51:13.122000 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866"
Feb 18 14:51:13 crc kubenswrapper[4817]: E0218 14:51:13.122269 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:51:27 crc kubenswrapper[4817]: I0218 14:51:27.172168 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866"
Feb 18 14:51:27 crc kubenswrapper[4817]: E0218 14:51:27.173362 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:51:39 crc kubenswrapper[4817]: I0218 14:51:39.172271 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866"
Feb 18 14:51:39 crc kubenswrapper[4817]: E0218 14:51:39.173154 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:51:53 crc kubenswrapper[4817]: I0218 14:51:53.171497 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866"
Feb 18 14:51:53 crc kubenswrapper[4817]: E0218 14:51:53.172407 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:52:05 crc kubenswrapper[4817]: I0218 14:52:05.172355 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866"
Feb 18 14:52:05 crc kubenswrapper[4817]: E0218 14:52:05.173233 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:52:18 crc kubenswrapper[4817]: I0218 14:52:18.184047 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866"
Feb 18 14:52:18 crc kubenswrapper[4817]: E0218 14:52:18.185020 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.101745 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t7hwd"]
Feb 18 14:52:28 crc kubenswrapper[4817]: E0218 14:52:28.102761 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerName="extract-content"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.102774 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerName="extract-content"
Feb 18 14:52:28 crc kubenswrapper[4817]: E0218 14:52:28.102796 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerName="extract-utilities"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.102802 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerName="extract-utilities"
Feb 18 14:52:28 crc kubenswrapper[4817]: E0218 14:52:28.102832 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerName="extract-content"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.102839 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerName="extract-content"
Feb 18 14:52:28 crc kubenswrapper[4817]: E0218 14:52:28.102854 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerName="registry-server"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.102859 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerName="registry-server"
Feb 18 14:52:28 crc kubenswrapper[4817]: E0218 14:52:28.102870 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerName="registry-server"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.102875 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerName="registry-server"
Feb 18 14:52:28 crc kubenswrapper[4817]: E0218 14:52:28.102890 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerName="extract-utilities"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.102895 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerName="extract-utilities"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.103106 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2734b9c-4b1c-44bd-9708-f5e9541dc977" containerName="registry-server"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.103121 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="395eb3a9-9a33-4911-8b88-8138c6b01997" containerName="registry-server"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.105021 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.116076 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t7hwd"]
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.249398 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq86l\" (UniqueName: \"kubernetes.io/projected/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-kube-api-access-vq86l\") pod \"certified-operators-t7hwd\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") " pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.249735 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-utilities\") pod \"certified-operators-t7hwd\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") " pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.249801 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-catalog-content\") pod \"certified-operators-t7hwd\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") " pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.353503 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq86l\" (UniqueName: \"kubernetes.io/projected/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-kube-api-access-vq86l\") pod \"certified-operators-t7hwd\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") " pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.353585 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-utilities\") pod \"certified-operators-t7hwd\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") " pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.353758 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-catalog-content\") pod \"certified-operators-t7hwd\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") " pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.353972 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-utilities\") pod \"certified-operators-t7hwd\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") " pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.354197 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-catalog-content\") pod \"certified-operators-t7hwd\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") " pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.375836 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq86l\" (UniqueName: \"kubernetes.io/projected/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-kube-api-access-vq86l\") pod \"certified-operators-t7hwd\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") " pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.431956 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:28 crc kubenswrapper[4817]: I0218 14:52:28.965325 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t7hwd"]
Feb 18 14:52:29 crc kubenswrapper[4817]: I0218 14:52:29.801014 4817 generic.go:334] "Generic (PLEG): container finished" podID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerID="89882b8f603a66a1347ad86ba803494a7dc99f8f7d6daaefcf2054e2c7f03a75" exitCode=0
Feb 18 14:52:29 crc kubenswrapper[4817]: I0218 14:52:29.801078 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7hwd" event={"ID":"5ac3b5eb-198f-4b4a-857d-d24ec3f42282","Type":"ContainerDied","Data":"89882b8f603a66a1347ad86ba803494a7dc99f8f7d6daaefcf2054e2c7f03a75"}
Feb 18 14:52:29 crc kubenswrapper[4817]: I0218 14:52:29.801315 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7hwd" event={"ID":"5ac3b5eb-198f-4b4a-857d-d24ec3f42282","Type":"ContainerStarted","Data":"d0776c51831191f3ed20dfaa71a47832b8dd936fc6d232cafa0e36e2794170d1"}
Feb 18 14:52:30 crc kubenswrapper[4817]: I0218 14:52:30.812438 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7hwd" event={"ID":"5ac3b5eb-198f-4b4a-857d-d24ec3f42282","Type":"ContainerStarted","Data":"d90f66e66909a12f979a374f9c0f3e2d36dcaa30ae9ddb10b44d018bf9c2e0db"}
Feb 18 14:52:31 crc kubenswrapper[4817]: I0218 14:52:31.171912 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866"
Feb 18 14:52:31 crc kubenswrapper[4817]: E0218 14:52:31.172243 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 14:52:31 crc kubenswrapper[4817]: I0218 14:52:31.822144 4817 generic.go:334] "Generic (PLEG): container finished" podID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerID="d90f66e66909a12f979a374f9c0f3e2d36dcaa30ae9ddb10b44d018bf9c2e0db" exitCode=0
Feb 18 14:52:31 crc kubenswrapper[4817]: I0218 14:52:31.822227 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7hwd" event={"ID":"5ac3b5eb-198f-4b4a-857d-d24ec3f42282","Type":"ContainerDied","Data":"d90f66e66909a12f979a374f9c0f3e2d36dcaa30ae9ddb10b44d018bf9c2e0db"}
Feb 18 14:52:32 crc kubenswrapper[4817]: I0218 14:52:32.834019 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7hwd" event={"ID":"5ac3b5eb-198f-4b4a-857d-d24ec3f42282","Type":"ContainerStarted","Data":"287447bb33cf5cc7f540512475707df6a754e7a509fcf2f368e11801a73d540a"}
Feb 18 14:52:32 crc kubenswrapper[4817]: I0218 14:52:32.854903 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t7hwd" podStartSLOduration=2.438566921 podStartE2EDuration="4.8548767s" podCreationTimestamp="2026-02-18 14:52:28 +0000 UTC" firstStartedPulling="2026-02-18 14:52:29.802879636 +0000 UTC m=+3212.378415619" lastFinishedPulling="2026-02-18 14:52:32.219189425 +0000 UTC m=+3214.794725398" observedRunningTime="2026-02-18 14:52:32.850562351 +0000 UTC m=+3215.426098334" watchObservedRunningTime="2026-02-18 14:52:32.8548767 +0000 UTC m=+3215.430412683"
Feb 18 14:52:38 crc kubenswrapper[4817]: I0218 14:52:38.432242 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:38 crc kubenswrapper[4817]: I0218 14:52:38.432818 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:38 crc kubenswrapper[4817]: I0218 14:52:38.482543 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:38 crc kubenswrapper[4817]: I0218 14:52:38.933580 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:38 crc kubenswrapper[4817]: I0218 14:52:38.984260 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t7hwd"]
Feb 18 14:52:40 crc kubenswrapper[4817]: I0218 14:52:40.907534 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t7hwd" podUID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerName="registry-server" containerID="cri-o://287447bb33cf5cc7f540512475707df6a754e7a509fcf2f368e11801a73d540a" gracePeriod=2
Feb 18 14:52:41 crc kubenswrapper[4817]: I0218 14:52:41.926062 4817 generic.go:334] "Generic (PLEG): container finished" podID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerID="287447bb33cf5cc7f540512475707df6a754e7a509fcf2f368e11801a73d540a" exitCode=0
Feb 18 14:52:41 crc kubenswrapper[4817]: I0218 14:52:41.926659 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7hwd" event={"ID":"5ac3b5eb-198f-4b4a-857d-d24ec3f42282","Type":"ContainerDied","Data":"287447bb33cf5cc7f540512475707df6a754e7a509fcf2f368e11801a73d540a"}
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.049139 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7hwd"
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.143781 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-utilities\") pod \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") "
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.144149 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq86l\" (UniqueName: \"kubernetes.io/projected/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-kube-api-access-vq86l\") pod \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") "
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.144304 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-catalog-content\") pod \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\" (UID: \"5ac3b5eb-198f-4b4a-857d-d24ec3f42282\") "
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.144964 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-utilities" (OuterVolumeSpecName: "utilities") pod "5ac3b5eb-198f-4b4a-857d-d24ec3f42282" (UID: "5ac3b5eb-198f-4b4a-857d-d24ec3f42282"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.155404 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-kube-api-access-vq86l" (OuterVolumeSpecName: "kube-api-access-vq86l") pod "5ac3b5eb-198f-4b4a-857d-d24ec3f42282" (UID: "5ac3b5eb-198f-4b4a-857d-d24ec3f42282"). InnerVolumeSpecName "kube-api-access-vq86l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.212336 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ac3b5eb-198f-4b4a-857d-d24ec3f42282" (UID: "5ac3b5eb-198f-4b4a-857d-d24ec3f42282"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.247079 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.247307 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq86l\" (UniqueName: \"kubernetes.io/projected/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-kube-api-access-vq86l\") on node \"crc\" DevicePath \"\""
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.247322 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ac3b5eb-198f-4b4a-857d-d24ec3f42282-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.939274 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7hwd" event={"ID":"5ac3b5eb-198f-4b4a-857d-d24ec3f42282","Type":"ContainerDied","Data":"d0776c51831191f3ed20dfaa71a47832b8dd936fc6d232cafa0e36e2794170d1"}
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.939347 4817 scope.go:117] "RemoveContainer" containerID="287447bb33cf5cc7f540512475707df6a754e7a509fcf2f368e11801a73d540a"
Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.939419 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t7hwd" Feb 18 14:52:42 crc kubenswrapper[4817]: I0218 14:52:42.979706 4817 scope.go:117] "RemoveContainer" containerID="d90f66e66909a12f979a374f9c0f3e2d36dcaa30ae9ddb10b44d018bf9c2e0db" Feb 18 14:52:43 crc kubenswrapper[4817]: I0218 14:52:43.001167 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t7hwd"] Feb 18 14:52:43 crc kubenswrapper[4817]: I0218 14:52:43.004425 4817 scope.go:117] "RemoveContainer" containerID="89882b8f603a66a1347ad86ba803494a7dc99f8f7d6daaefcf2054e2c7f03a75" Feb 18 14:52:43 crc kubenswrapper[4817]: I0218 14:52:43.004658 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t7hwd"] Feb 18 14:52:43 crc kubenswrapper[4817]: I0218 14:52:43.171723 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:52:43 crc kubenswrapper[4817]: E0218 14:52:43.172063 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:52:44 crc kubenswrapper[4817]: I0218 14:52:44.183358 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" path="/var/lib/kubelet/pods/5ac3b5eb-198f-4b4a-857d-d24ec3f42282/volumes" Feb 18 14:52:57 crc kubenswrapper[4817]: I0218 14:52:57.172262 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:52:57 crc kubenswrapper[4817]: E0218 14:52:57.172974 4817 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:53:11 crc kubenswrapper[4817]: I0218 14:53:11.171645 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:53:11 crc kubenswrapper[4817]: E0218 14:53:11.172399 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:53:24 crc kubenswrapper[4817]: I0218 14:53:24.173300 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:53:24 crc kubenswrapper[4817]: E0218 14:53:24.174411 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:53:37 crc kubenswrapper[4817]: I0218 14:53:37.171943 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:53:37 crc kubenswrapper[4817]: E0218 14:53:37.173063 4817 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:53:49 crc kubenswrapper[4817]: I0218 14:53:49.171915 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:53:49 crc kubenswrapper[4817]: E0218 14:53:49.172604 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:54:04 crc kubenswrapper[4817]: I0218 14:54:04.172515 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:54:04 crc kubenswrapper[4817]: E0218 14:54:04.173353 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:54:15 crc kubenswrapper[4817]: I0218 14:54:15.171599 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:54:15 crc kubenswrapper[4817]: E0218 14:54:15.172649 4817 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:54:29 crc kubenswrapper[4817]: I0218 14:54:29.173391 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:54:29 crc kubenswrapper[4817]: E0218 14:54:29.174765 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:54:41 crc kubenswrapper[4817]: I0218 14:54:41.179503 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:54:41 crc kubenswrapper[4817]: E0218 14:54:41.182562 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:54:53 crc kubenswrapper[4817]: I0218 14:54:53.171654 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:54:53 crc kubenswrapper[4817]: E0218 
14:54:53.173616 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:55:07 crc kubenswrapper[4817]: I0218 14:55:07.172202 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:55:07 crc kubenswrapper[4817]: E0218 14:55:07.173132 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:55:22 crc kubenswrapper[4817]: I0218 14:55:22.172210 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:55:22 crc kubenswrapper[4817]: E0218 14:55:22.173150 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:55:34 crc kubenswrapper[4817]: I0218 14:55:34.172077 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:55:34 crc 
kubenswrapper[4817]: E0218 14:55:34.172887 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:55:46 crc kubenswrapper[4817]: I0218 14:55:46.171879 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:55:46 crc kubenswrapper[4817]: E0218 14:55:46.172668 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:55:59 crc kubenswrapper[4817]: I0218 14:55:59.172562 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 14:55:59 crc kubenswrapper[4817]: E0218 14:55:59.173292 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 14:56:13 crc kubenswrapper[4817]: I0218 14:56:13.171253 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 
18 14:56:13 crc kubenswrapper[4817]: I0218 14:56:13.873063 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"912706dfe852d2fdaef61a959cfa69bd788ba5cc2a058bcd9b22176603043b9b"} Feb 18 14:58:42 crc kubenswrapper[4817]: I0218 14:58:42.863764 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:58:42 crc kubenswrapper[4817]: I0218 14:58:42.864605 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:59:12 crc kubenswrapper[4817]: I0218 14:59:12.862969 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:59:12 crc kubenswrapper[4817]: I0218 14:59:12.864188 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:59:42 crc kubenswrapper[4817]: I0218 14:59:42.864298 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 14:59:42 crc kubenswrapper[4817]: I0218 14:59:42.864953 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 14:59:42 crc kubenswrapper[4817]: I0218 14:59:42.865034 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 14:59:42 crc kubenswrapper[4817]: I0218 14:59:42.865961 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"912706dfe852d2fdaef61a959cfa69bd788ba5cc2a058bcd9b22176603043b9b"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 14:59:42 crc kubenswrapper[4817]: I0218 14:59:42.866049 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://912706dfe852d2fdaef61a959cfa69bd788ba5cc2a058bcd9b22176603043b9b" gracePeriod=600 Feb 18 14:59:43 crc kubenswrapper[4817]: I0218 14:59:43.965874 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="912706dfe852d2fdaef61a959cfa69bd788ba5cc2a058bcd9b22176603043b9b" exitCode=0 Feb 18 14:59:43 crc kubenswrapper[4817]: I0218 14:59:43.966160 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"912706dfe852d2fdaef61a959cfa69bd788ba5cc2a058bcd9b22176603043b9b"} Feb 18 14:59:43 crc kubenswrapper[4817]: I0218 14:59:43.966389 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"} Feb 18 14:59:43 crc kubenswrapper[4817]: I0218 14:59:43.966415 4817 scope.go:117] "RemoveContainer" containerID="6050ca918ef20814018f5d5707bf221e3958126567785aad804d21ce8f34c866" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.222295 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd"] Feb 18 15:00:00 crc kubenswrapper[4817]: E0218 15:00:00.224185 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerName="extract-content" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.224264 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerName="extract-content" Feb 18 15:00:00 crc kubenswrapper[4817]: E0218 15:00:00.224361 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerName="registry-server" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.224417 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerName="registry-server" Feb 18 15:00:00 crc kubenswrapper[4817]: E0218 15:00:00.224484 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerName="extract-utilities" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.224548 4817 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerName="extract-utilities" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.224823 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac3b5eb-198f-4b4a-857d-d24ec3f42282" containerName="registry-server" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.225815 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.230055 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.230434 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.238187 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd"] Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.380888 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwcrl\" (UniqueName: \"kubernetes.io/projected/f7e5827f-5f50-4e75-854b-f8d9e95e40be-kube-api-access-vwcrl\") pod \"collect-profiles-29523780-b8vdd\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.381014 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7e5827f-5f50-4e75-854b-f8d9e95e40be-secret-volume\") pod \"collect-profiles-29523780-b8vdd\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.381498 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7e5827f-5f50-4e75-854b-f8d9e95e40be-config-volume\") pod \"collect-profiles-29523780-b8vdd\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.484891 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwcrl\" (UniqueName: \"kubernetes.io/projected/f7e5827f-5f50-4e75-854b-f8d9e95e40be-kube-api-access-vwcrl\") pod \"collect-profiles-29523780-b8vdd\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.485114 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7e5827f-5f50-4e75-854b-f8d9e95e40be-secret-volume\") pod \"collect-profiles-29523780-b8vdd\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.485229 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7e5827f-5f50-4e75-854b-f8d9e95e40be-config-volume\") pod \"collect-profiles-29523780-b8vdd\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.486461 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f7e5827f-5f50-4e75-854b-f8d9e95e40be-config-volume\") pod \"collect-profiles-29523780-b8vdd\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.492478 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7e5827f-5f50-4e75-854b-f8d9e95e40be-secret-volume\") pod \"collect-profiles-29523780-b8vdd\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.503927 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwcrl\" (UniqueName: \"kubernetes.io/projected/f7e5827f-5f50-4e75-854b-f8d9e95e40be-kube-api-access-vwcrl\") pod \"collect-profiles-29523780-b8vdd\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:00 crc kubenswrapper[4817]: I0218 15:00:00.547651 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:01 crc kubenswrapper[4817]: I0218 15:00:01.051158 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd"] Feb 18 15:00:01 crc kubenswrapper[4817]: W0218 15:00:01.055590 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e5827f_5f50_4e75_854b_f8d9e95e40be.slice/crio-314ed6db7b49769fb95a99a53844ee07c669a4cf49c379089bb4f18818e2627f WatchSource:0}: Error finding container 314ed6db7b49769fb95a99a53844ee07c669a4cf49c379089bb4f18818e2627f: Status 404 returned error can't find the container with id 314ed6db7b49769fb95a99a53844ee07c669a4cf49c379089bb4f18818e2627f Feb 18 15:00:01 crc kubenswrapper[4817]: I0218 15:00:01.158066 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" event={"ID":"f7e5827f-5f50-4e75-854b-f8d9e95e40be","Type":"ContainerStarted","Data":"314ed6db7b49769fb95a99a53844ee07c669a4cf49c379089bb4f18818e2627f"} Feb 18 15:00:02 crc kubenswrapper[4817]: I0218 15:00:02.171208 4817 generic.go:334] "Generic (PLEG): container finished" podID="f7e5827f-5f50-4e75-854b-f8d9e95e40be" containerID="254186bb1a7c47ad4c9cc1e0682b73b910d8ec53d9e593e3fb5f84fff47379da" exitCode=0 Feb 18 15:00:02 crc kubenswrapper[4817]: I0218 15:00:02.183145 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" event={"ID":"f7e5827f-5f50-4e75-854b-f8d9e95e40be","Type":"ContainerDied","Data":"254186bb1a7c47ad4c9cc1e0682b73b910d8ec53d9e593e3fb5f84fff47379da"} Feb 18 15:00:03 crc kubenswrapper[4817]: I0218 15:00:03.626031 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" Feb 18 15:00:03 crc kubenswrapper[4817]: I0218 15:00:03.768511 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7e5827f-5f50-4e75-854b-f8d9e95e40be-config-volume\") pod \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " Feb 18 15:00:03 crc kubenswrapper[4817]: I0218 15:00:03.769049 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7e5827f-5f50-4e75-854b-f8d9e95e40be-secret-volume\") pod \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " Feb 18 15:00:03 crc kubenswrapper[4817]: I0218 15:00:03.769082 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwcrl\" (UniqueName: \"kubernetes.io/projected/f7e5827f-5f50-4e75-854b-f8d9e95e40be-kube-api-access-vwcrl\") pod \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\" (UID: \"f7e5827f-5f50-4e75-854b-f8d9e95e40be\") " Feb 18 15:00:03 crc kubenswrapper[4817]: I0218 15:00:03.769700 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e5827f-5f50-4e75-854b-f8d9e95e40be-config-volume" (OuterVolumeSpecName: "config-volume") pod "f7e5827f-5f50-4e75-854b-f8d9e95e40be" (UID: "f7e5827f-5f50-4e75-854b-f8d9e95e40be"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 15:00:03 crc kubenswrapper[4817]: I0218 15:00:03.776032 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e5827f-5f50-4e75-854b-f8d9e95e40be-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f7e5827f-5f50-4e75-854b-f8d9e95e40be" (UID: "f7e5827f-5f50-4e75-854b-f8d9e95e40be"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 15:00:03 crc kubenswrapper[4817]: I0218 15:00:03.776148 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e5827f-5f50-4e75-854b-f8d9e95e40be-kube-api-access-vwcrl" (OuterVolumeSpecName: "kube-api-access-vwcrl") pod "f7e5827f-5f50-4e75-854b-f8d9e95e40be" (UID: "f7e5827f-5f50-4e75-854b-f8d9e95e40be"). InnerVolumeSpecName "kube-api-access-vwcrl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 15:00:03 crc kubenswrapper[4817]: I0218 15:00:03.871917 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7e5827f-5f50-4e75-854b-f8d9e95e40be-config-volume\") on node \"crc\" DevicePath \"\""
Feb 18 15:00:03 crc kubenswrapper[4817]: I0218 15:00:03.872180 4817 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7e5827f-5f50-4e75-854b-f8d9e95e40be-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 18 15:00:03 crc kubenswrapper[4817]: I0218 15:00:03.872295 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwcrl\" (UniqueName: \"kubernetes.io/projected/f7e5827f-5f50-4e75-854b-f8d9e95e40be-kube-api-access-vwcrl\") on node \"crc\" DevicePath \"\""
Feb 18 15:00:04 crc kubenswrapper[4817]: I0218 15:00:04.197702 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd" event={"ID":"f7e5827f-5f50-4e75-854b-f8d9e95e40be","Type":"ContainerDied","Data":"314ed6db7b49769fb95a99a53844ee07c669a4cf49c379089bb4f18818e2627f"}
Feb 18 15:00:04 crc kubenswrapper[4817]: I0218 15:00:04.197755 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="314ed6db7b49769fb95a99a53844ee07c669a4cf49c379089bb4f18818e2627f"
Feb 18 15:00:04 crc kubenswrapper[4817]: I0218 15:00:04.197798 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523780-b8vdd"
Feb 18 15:00:04 crc kubenswrapper[4817]: I0218 15:00:04.707782 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl"]
Feb 18 15:00:04 crc kubenswrapper[4817]: I0218 15:00:04.719572 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523735-fw4vl"]
Feb 18 15:00:06 crc kubenswrapper[4817]: I0218 15:00:06.184698 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6441d64-55c6-45f3-a648-74924b94b4f0" path="/var/lib/kubelet/pods/b6441d64-55c6-45f3-a648-74924b94b4f0/volumes"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.318335 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7657x"]
Feb 18 15:00:31 crc kubenswrapper[4817]: E0218 15:00:31.319864 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e5827f-5f50-4e75-854b-f8d9e95e40be" containerName="collect-profiles"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.319888 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e5827f-5f50-4e75-854b-f8d9e95e40be" containerName="collect-profiles"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.320242 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e5827f-5f50-4e75-854b-f8d9e95e40be" containerName="collect-profiles"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.322627 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.347381 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7657x"]
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.473373 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80976ca8-de28-4b71-a0d1-f3aeb4410466-catalog-content\") pod \"community-operators-7657x\" (UID: \"80976ca8-de28-4b71-a0d1-f3aeb4410466\") " pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.473462 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80976ca8-de28-4b71-a0d1-f3aeb4410466-utilities\") pod \"community-operators-7657x\" (UID: \"80976ca8-de28-4b71-a0d1-f3aeb4410466\") " pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.473535 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k8sz\" (UniqueName: \"kubernetes.io/projected/80976ca8-de28-4b71-a0d1-f3aeb4410466-kube-api-access-8k8sz\") pod \"community-operators-7657x\" (UID: \"80976ca8-de28-4b71-a0d1-f3aeb4410466\") " pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.575219 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80976ca8-de28-4b71-a0d1-f3aeb4410466-catalog-content\") pod \"community-operators-7657x\" (UID: \"80976ca8-de28-4b71-a0d1-f3aeb4410466\") " pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.575302 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80976ca8-de28-4b71-a0d1-f3aeb4410466-utilities\") pod \"community-operators-7657x\" (UID: \"80976ca8-de28-4b71-a0d1-f3aeb4410466\") " pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.575363 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k8sz\" (UniqueName: \"kubernetes.io/projected/80976ca8-de28-4b71-a0d1-f3aeb4410466-kube-api-access-8k8sz\") pod \"community-operators-7657x\" (UID: \"80976ca8-de28-4b71-a0d1-f3aeb4410466\") " pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.575850 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80976ca8-de28-4b71-a0d1-f3aeb4410466-catalog-content\") pod \"community-operators-7657x\" (UID: \"80976ca8-de28-4b71-a0d1-f3aeb4410466\") " pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.575938 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80976ca8-de28-4b71-a0d1-f3aeb4410466-utilities\") pod \"community-operators-7657x\" (UID: \"80976ca8-de28-4b71-a0d1-f3aeb4410466\") " pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.597513 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k8sz\" (UniqueName: \"kubernetes.io/projected/80976ca8-de28-4b71-a0d1-f3aeb4410466-kube-api-access-8k8sz\") pod \"community-operators-7657x\" (UID: \"80976ca8-de28-4b71-a0d1-f3aeb4410466\") " pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:31 crc kubenswrapper[4817]: I0218 15:00:31.663756 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:32 crc kubenswrapper[4817]: I0218 15:00:32.298815 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7657x"]
Feb 18 15:00:32 crc kubenswrapper[4817]: I0218 15:00:32.497060 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7657x" event={"ID":"80976ca8-de28-4b71-a0d1-f3aeb4410466","Type":"ContainerStarted","Data":"e6aeaa0ac30eed7df17d3b5ddd671294670dfd563d38311c831eff44e8bc9554"}
Feb 18 15:00:33 crc kubenswrapper[4817]: I0218 15:00:33.438556 4817 scope.go:117] "RemoveContainer" containerID="6b4a9274808cae48ea94dc6d50a89327c5a88a9b67bfb177ac190dbacc4a5757"
Feb 18 15:00:33 crc kubenswrapper[4817]: I0218 15:00:33.506471 4817 generic.go:334] "Generic (PLEG): container finished" podID="80976ca8-de28-4b71-a0d1-f3aeb4410466" containerID="9fd929f944db1a796565e80c4f9fd6a9b9d52eda0793d51e92b246ce02c9fb43" exitCode=0
Feb 18 15:00:33 crc kubenswrapper[4817]: I0218 15:00:33.506585 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7657x" event={"ID":"80976ca8-de28-4b71-a0d1-f3aeb4410466","Type":"ContainerDied","Data":"9fd929f944db1a796565e80c4f9fd6a9b9d52eda0793d51e92b246ce02c9fb43"}
Feb 18 15:00:33 crc kubenswrapper[4817]: I0218 15:00:33.523869 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 15:00:37 crc kubenswrapper[4817]: I0218 15:00:37.545201 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7657x" event={"ID":"80976ca8-de28-4b71-a0d1-f3aeb4410466","Type":"ContainerStarted","Data":"77ae44e6a24f87a59abe57606bc9148db82af6b060125d257fdfd90431af0394"}
Feb 18 15:00:38 crc kubenswrapper[4817]: I0218 15:00:38.554515 4817 generic.go:334] "Generic (PLEG): container finished" podID="80976ca8-de28-4b71-a0d1-f3aeb4410466" containerID="77ae44e6a24f87a59abe57606bc9148db82af6b060125d257fdfd90431af0394" exitCode=0
Feb 18 15:00:38 crc kubenswrapper[4817]: I0218 15:00:38.554612 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7657x" event={"ID":"80976ca8-de28-4b71-a0d1-f3aeb4410466","Type":"ContainerDied","Data":"77ae44e6a24f87a59abe57606bc9148db82af6b060125d257fdfd90431af0394"}
Feb 18 15:00:39 crc kubenswrapper[4817]: I0218 15:00:39.581595 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7657x" event={"ID":"80976ca8-de28-4b71-a0d1-f3aeb4410466","Type":"ContainerStarted","Data":"431f6061589b02692d38505f23bfbf67d0d348c680336c1bbdcaf59dd3ff9e66"}
Feb 18 15:00:39 crc kubenswrapper[4817]: I0218 15:00:39.609638 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7657x" podStartSLOduration=3.1816977 podStartE2EDuration="8.609616757s" podCreationTimestamp="2026-02-18 15:00:31 +0000 UTC" firstStartedPulling="2026-02-18 15:00:33.523599522 +0000 UTC m=+3696.099135505" lastFinishedPulling="2026-02-18 15:00:38.951518569 +0000 UTC m=+3701.527054562" observedRunningTime="2026-02-18 15:00:39.60659149 +0000 UTC m=+3702.182127483" watchObservedRunningTime="2026-02-18 15:00:39.609616757 +0000 UTC m=+3702.185152740"
Feb 18 15:00:41 crc kubenswrapper[4817]: I0218 15:00:41.664260 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:41 crc kubenswrapper[4817]: I0218 15:00:41.664600 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:41 crc kubenswrapper[4817]: I0218 15:00:41.719747 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:51 crc kubenswrapper[4817]: I0218 15:00:51.722191 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7657x"
Feb 18 15:00:51 crc kubenswrapper[4817]: I0218 15:00:51.799816 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7657x"]
Feb 18 15:00:51 crc kubenswrapper[4817]: I0218 15:00:51.853000 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b54sl"]
Feb 18 15:00:51 crc kubenswrapper[4817]: I0218 15:00:51.853639 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b54sl" podUID="fc468458-8e7b-4993-aac2-87477b183acc" containerName="registry-server" containerID="cri-o://49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b" gracePeriod=2
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.478445 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b54sl"
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.527111 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-catalog-content\") pod \"fc468458-8e7b-4993-aac2-87477b183acc\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") "
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.527260 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-utilities\") pod \"fc468458-8e7b-4993-aac2-87477b183acc\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") "
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.527380 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rmpr\" (UniqueName: \"kubernetes.io/projected/fc468458-8e7b-4993-aac2-87477b183acc-kube-api-access-2rmpr\") pod \"fc468458-8e7b-4993-aac2-87477b183acc\" (UID: \"fc468458-8e7b-4993-aac2-87477b183acc\") "
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.528959 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-utilities" (OuterVolumeSpecName: "utilities") pod "fc468458-8e7b-4993-aac2-87477b183acc" (UID: "fc468458-8e7b-4993-aac2-87477b183acc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.553543 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc468458-8e7b-4993-aac2-87477b183acc-kube-api-access-2rmpr" (OuterVolumeSpecName: "kube-api-access-2rmpr") pod "fc468458-8e7b-4993-aac2-87477b183acc" (UID: "fc468458-8e7b-4993-aac2-87477b183acc"). InnerVolumeSpecName "kube-api-access-2rmpr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.631710 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rmpr\" (UniqueName: \"kubernetes.io/projected/fc468458-8e7b-4993-aac2-87477b183acc-kube-api-access-2rmpr\") on node \"crc\" DevicePath \"\""
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.631758 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.647047 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc468458-8e7b-4993-aac2-87477b183acc" (UID: "fc468458-8e7b-4993-aac2-87477b183acc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.720626 4817 generic.go:334] "Generic (PLEG): container finished" podID="fc468458-8e7b-4993-aac2-87477b183acc" containerID="49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b" exitCode=0
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.721856 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b54sl"
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.724898 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b54sl" event={"ID":"fc468458-8e7b-4993-aac2-87477b183acc","Type":"ContainerDied","Data":"49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b"}
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.724946 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b54sl" event={"ID":"fc468458-8e7b-4993-aac2-87477b183acc","Type":"ContainerDied","Data":"b6e20c1325c9367241c9c9dd69e530b339ccda44daa4d79a2b00b1ce0467ffd7"}
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.724968 4817 scope.go:117] "RemoveContainer" containerID="49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b"
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.733116 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc468458-8e7b-4993-aac2-87477b183acc-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.784157 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b54sl"]
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.785944 4817 scope.go:117] "RemoveContainer" containerID="c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b"
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.796657 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b54sl"]
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.829344 4817 scope.go:117] "RemoveContainer" containerID="8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d"
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.873691 4817 scope.go:117] "RemoveContainer" containerID="49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b"
Feb 18 15:00:52 crc kubenswrapper[4817]: E0218 15:00:52.874132 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b\": container with ID starting with 49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b not found: ID does not exist" containerID="49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b"
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.874155 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b"} err="failed to get container status \"49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b\": rpc error: code = NotFound desc = could not find container \"49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b\": container with ID starting with 49fd39a61797ffb027f7990ef79d0db01803b7aa1ea4df5aa3ff647d57b4e32b not found: ID does not exist"
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.874175 4817 scope.go:117] "RemoveContainer" containerID="c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b"
Feb 18 15:00:52 crc kubenswrapper[4817]: E0218 15:00:52.874454 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b\": container with ID starting with c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b not found: ID does not exist" containerID="c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b"
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.874477 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b"} err="failed to get container status \"c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b\": rpc error: code = NotFound desc = could not find container \"c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b\": container with ID starting with c6e04bfe466db6c26de56d221cd190e1ff13144289cc89cc7c5700214da7795b not found: ID does not exist"
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.874492 4817 scope.go:117] "RemoveContainer" containerID="8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d"
Feb 18 15:00:52 crc kubenswrapper[4817]: E0218 15:00:52.874801 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d\": container with ID starting with 8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d not found: ID does not exist" containerID="8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d"
Feb 18 15:00:52 crc kubenswrapper[4817]: I0218 15:00:52.874823 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d"} err="failed to get container status \"8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d\": rpc error: code = NotFound desc = could not find container \"8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d\": container with ID starting with 8815a11b1e72453b97934bb00951b616ffef6679970ccd480a0f71d0fd2ac63d not found: ID does not exist"
Feb 18 15:00:54 crc kubenswrapper[4817]: I0218 15:00:54.187727 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc468458-8e7b-4993-aac2-87477b183acc" path="/var/lib/kubelet/pods/fc468458-8e7b-4993-aac2-87477b183acc/volumes"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.155114 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29523781-vf8xd"]
Feb 18 15:01:00 crc kubenswrapper[4817]: E0218 15:01:00.156711 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc468458-8e7b-4993-aac2-87477b183acc" containerName="extract-content"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.156731 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc468458-8e7b-4993-aac2-87477b183acc" containerName="extract-content"
Feb 18 15:01:00 crc kubenswrapper[4817]: E0218 15:01:00.156756 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc468458-8e7b-4993-aac2-87477b183acc" containerName="extract-utilities"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.156764 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc468458-8e7b-4993-aac2-87477b183acc" containerName="extract-utilities"
Feb 18 15:01:00 crc kubenswrapper[4817]: E0218 15:01:00.156779 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc468458-8e7b-4993-aac2-87477b183acc" containerName="registry-server"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.156786 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc468458-8e7b-4993-aac2-87477b183acc" containerName="registry-server"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.157036 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc468458-8e7b-4993-aac2-87477b183acc" containerName="registry-server"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.158606 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.194594 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-config-data\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.194666 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmdr4\" (UniqueName: \"kubernetes.io/projected/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-kube-api-access-rmdr4\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.194800 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-combined-ca-bundle\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.194891 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-fernet-keys\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.198841 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29523781-vf8xd"]
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.296874 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-config-data\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.296945 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmdr4\" (UniqueName: \"kubernetes.io/projected/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-kube-api-access-rmdr4\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.297078 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-combined-ca-bundle\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.297153 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-fernet-keys\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.318713 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-config-data\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.319133 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-combined-ca-bundle\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.319293 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-fernet-keys\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.320729 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmdr4\" (UniqueName: \"kubernetes.io/projected/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-kube-api-access-rmdr4\") pod \"keystone-cron-29523781-vf8xd\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") " pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:00 crc kubenswrapper[4817]: I0218 15:01:00.492134 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:01 crc kubenswrapper[4817]: I0218 15:01:01.018599 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29523781-vf8xd"]
Feb 18 15:01:01 crc kubenswrapper[4817]: I0218 15:01:01.805656 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523781-vf8xd" event={"ID":"06ad1a18-8a33-4ac1-a6df-9cb4b251c549","Type":"ContainerStarted","Data":"6c7b746b2d21c3622696b95de3cd6f9e312a2192a83a74b77340877b234fce4b"}
Feb 18 15:01:01 crc kubenswrapper[4817]: I0218 15:01:01.806132 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523781-vf8xd" event={"ID":"06ad1a18-8a33-4ac1-a6df-9cb4b251c549","Type":"ContainerStarted","Data":"6f20ffde7fc7fb08c2d50e3536c0c825b774422bb503b5bee5f12314806c8365"}
Feb 18 15:01:01 crc kubenswrapper[4817]: I0218 15:01:01.830622 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29523781-vf8xd" podStartSLOduration=1.830604852 podStartE2EDuration="1.830604852s" podCreationTimestamp="2026-02-18 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 15:01:01.829512865 +0000 UTC m=+3724.405048848" watchObservedRunningTime="2026-02-18 15:01:01.830604852 +0000 UTC m=+3724.406140835"
Feb 18 15:01:05 crc kubenswrapper[4817]: I0218 15:01:05.842040 4817 generic.go:334] "Generic (PLEG): container finished" podID="06ad1a18-8a33-4ac1-a6df-9cb4b251c549" containerID="6c7b746b2d21c3622696b95de3cd6f9e312a2192a83a74b77340877b234fce4b" exitCode=0
Feb 18 15:01:05 crc kubenswrapper[4817]: I0218 15:01:05.842127 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523781-vf8xd" event={"ID":"06ad1a18-8a33-4ac1-a6df-9cb4b251c549","Type":"ContainerDied","Data":"6c7b746b2d21c3622696b95de3cd6f9e312a2192a83a74b77340877b234fce4b"}
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.280848 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.432928 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-combined-ca-bundle\") pod \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") "
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.433012 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmdr4\" (UniqueName: \"kubernetes.io/projected/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-kube-api-access-rmdr4\") pod \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") "
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.433108 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-config-data\") pod \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") "
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.433196 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-fernet-keys\") pod \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\" (UID: \"06ad1a18-8a33-4ac1-a6df-9cb4b251c549\") "
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.439013 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-kube-api-access-rmdr4" (OuterVolumeSpecName: "kube-api-access-rmdr4") pod "06ad1a18-8a33-4ac1-a6df-9cb4b251c549" (UID: "06ad1a18-8a33-4ac1-a6df-9cb4b251c549"). InnerVolumeSpecName "kube-api-access-rmdr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.439499 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "06ad1a18-8a33-4ac1-a6df-9cb4b251c549" (UID: "06ad1a18-8a33-4ac1-a6df-9cb4b251c549"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.467059 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06ad1a18-8a33-4ac1-a6df-9cb4b251c549" (UID: "06ad1a18-8a33-4ac1-a6df-9cb4b251c549"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.486843 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-config-data" (OuterVolumeSpecName: "config-data") pod "06ad1a18-8a33-4ac1-a6df-9cb4b251c549" (UID: "06ad1a18-8a33-4ac1-a6df-9cb4b251c549"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.536331 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-config-data\") on node \"crc\" DevicePath \"\""
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.536480 4817 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.536558 4817 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.536631 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmdr4\" (UniqueName: \"kubernetes.io/projected/06ad1a18-8a33-4ac1-a6df-9cb4b251c549-kube-api-access-rmdr4\") on node \"crc\" DevicePath \"\""
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.863854 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29523781-vf8xd" event={"ID":"06ad1a18-8a33-4ac1-a6df-9cb4b251c549","Type":"ContainerDied","Data":"6f20ffde7fc7fb08c2d50e3536c0c825b774422bb503b5bee5f12314806c8365"}
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.863902 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f20ffde7fc7fb08c2d50e3536c0c825b774422bb503b5bee5f12314806c8365"
Feb 18 15:01:07 crc kubenswrapper[4817]: I0218 15:01:07.863915 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29523781-vf8xd"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.060705 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6qt2m"]
Feb 18 15:01:37 crc kubenswrapper[4817]: E0218 15:01:37.061999 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ad1a18-8a33-4ac1-a6df-9cb4b251c549" containerName="keystone-cron"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.062019 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ad1a18-8a33-4ac1-a6df-9cb4b251c549" containerName="keystone-cron"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.062276 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ad1a18-8a33-4ac1-a6df-9cb4b251c549" containerName="keystone-cron"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.064402 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qt2m"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.076777 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qt2m"]
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.232701 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkcjj\" (UniqueName: \"kubernetes.io/projected/aef56874-7291-4a81-a724-d0290732f2ea-kube-api-access-wkcjj\") pod \"redhat-operators-6qt2m\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " pod="openshift-marketplace/redhat-operators-6qt2m"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.232805 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-utilities\") pod \"redhat-operators-6qt2m\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " pod="openshift-marketplace/redhat-operators-6qt2m"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.232932 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-catalog-content\") pod \"redhat-operators-6qt2m\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " pod="openshift-marketplace/redhat-operators-6qt2m"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.335264 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-catalog-content\") pod \"redhat-operators-6qt2m\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " pod="openshift-marketplace/redhat-operators-6qt2m"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.335846 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-catalog-content\") pod \"redhat-operators-6qt2m\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " pod="openshift-marketplace/redhat-operators-6qt2m"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.335869 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkcjj\" (UniqueName: \"kubernetes.io/projected/aef56874-7291-4a81-a724-d0290732f2ea-kube-api-access-wkcjj\") pod \"redhat-operators-6qt2m\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " pod="openshift-marketplace/redhat-operators-6qt2m"
Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.335999 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-utilities\") pod \"redhat-operators-6qt2m\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") "
pod="openshift-marketplace/redhat-operators-6qt2m" Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.336538 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-utilities\") pod \"redhat-operators-6qt2m\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " pod="openshift-marketplace/redhat-operators-6qt2m" Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.367663 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkcjj\" (UniqueName: \"kubernetes.io/projected/aef56874-7291-4a81-a724-d0290732f2ea-kube-api-access-wkcjj\") pod \"redhat-operators-6qt2m\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " pod="openshift-marketplace/redhat-operators-6qt2m" Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.390672 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qt2m" Feb 18 15:01:37 crc kubenswrapper[4817]: I0218 15:01:37.870800 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qt2m"] Feb 18 15:01:37 crc kubenswrapper[4817]: W0218 15:01:37.872282 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef56874_7291_4a81_a724_d0290732f2ea.slice/crio-9f3dfe949df01afaf9de1be56824fbb59508c578f408fb4b8fa94fdd228b1901 WatchSource:0}: Error finding container 9f3dfe949df01afaf9de1be56824fbb59508c578f408fb4b8fa94fdd228b1901: Status 404 returned error can't find the container with id 9f3dfe949df01afaf9de1be56824fbb59508c578f408fb4b8fa94fdd228b1901 Feb 18 15:01:38 crc kubenswrapper[4817]: I0218 15:01:38.161609 4817 generic.go:334] "Generic (PLEG): container finished" podID="aef56874-7291-4a81-a724-d0290732f2ea" containerID="0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a" exitCode=0 Feb 18 15:01:38 crc 
kubenswrapper[4817]: I0218 15:01:38.161719 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qt2m" event={"ID":"aef56874-7291-4a81-a724-d0290732f2ea","Type":"ContainerDied","Data":"0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a"} Feb 18 15:01:38 crc kubenswrapper[4817]: I0218 15:01:38.161899 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qt2m" event={"ID":"aef56874-7291-4a81-a724-d0290732f2ea","Type":"ContainerStarted","Data":"9f3dfe949df01afaf9de1be56824fbb59508c578f408fb4b8fa94fdd228b1901"} Feb 18 15:01:39 crc kubenswrapper[4817]: I0218 15:01:39.172664 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qt2m" event={"ID":"aef56874-7291-4a81-a724-d0290732f2ea","Type":"ContainerStarted","Data":"562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0"} Feb 18 15:01:43 crc kubenswrapper[4817]: I0218 15:01:43.213278 4817 generic.go:334] "Generic (PLEG): container finished" podID="aef56874-7291-4a81-a724-d0290732f2ea" containerID="562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0" exitCode=0 Feb 18 15:01:43 crc kubenswrapper[4817]: I0218 15:01:43.213358 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qt2m" event={"ID":"aef56874-7291-4a81-a724-d0290732f2ea","Type":"ContainerDied","Data":"562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0"} Feb 18 15:01:44 crc kubenswrapper[4817]: I0218 15:01:44.225441 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qt2m" event={"ID":"aef56874-7291-4a81-a724-d0290732f2ea","Type":"ContainerStarted","Data":"638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099"} Feb 18 15:01:44 crc kubenswrapper[4817]: I0218 15:01:44.252188 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-6qt2m" podStartSLOduration=1.688228547 podStartE2EDuration="7.252162496s" podCreationTimestamp="2026-02-18 15:01:37 +0000 UTC" firstStartedPulling="2026-02-18 15:01:38.163190573 +0000 UTC m=+3760.738726556" lastFinishedPulling="2026-02-18 15:01:43.727124522 +0000 UTC m=+3766.302660505" observedRunningTime="2026-02-18 15:01:44.244088423 +0000 UTC m=+3766.819624406" watchObservedRunningTime="2026-02-18 15:01:44.252162496 +0000 UTC m=+3766.827698479" Feb 18 15:01:47 crc kubenswrapper[4817]: I0218 15:01:47.391483 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6qt2m" Feb 18 15:01:47 crc kubenswrapper[4817]: I0218 15:01:47.392567 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6qt2m" Feb 18 15:01:48 crc kubenswrapper[4817]: I0218 15:01:48.438041 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6qt2m" podUID="aef56874-7291-4a81-a724-d0290732f2ea" containerName="registry-server" probeResult="failure" output=< Feb 18 15:01:48 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 15:01:48 crc kubenswrapper[4817]: > Feb 18 15:01:57 crc kubenswrapper[4817]: I0218 15:01:57.438306 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6qt2m" Feb 18 15:01:57 crc kubenswrapper[4817]: I0218 15:01:57.497706 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6qt2m" Feb 18 15:01:57 crc kubenswrapper[4817]: I0218 15:01:57.674322 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6qt2m"] Feb 18 15:01:59 crc kubenswrapper[4817]: I0218 15:01:59.362914 4817 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-6qt2m" podUID="aef56874-7291-4a81-a724-d0290732f2ea" containerName="registry-server" containerID="cri-o://638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099" gracePeriod=2 Feb 18 15:01:59 crc kubenswrapper[4817]: I0218 15:01:59.896330 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qt2m" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.007194 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-catalog-content\") pod \"aef56874-7291-4a81-a724-d0290732f2ea\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.007333 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkcjj\" (UniqueName: \"kubernetes.io/projected/aef56874-7291-4a81-a724-d0290732f2ea-kube-api-access-wkcjj\") pod \"aef56874-7291-4a81-a724-d0290732f2ea\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.007517 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-utilities\") pod \"aef56874-7291-4a81-a724-d0290732f2ea\" (UID: \"aef56874-7291-4a81-a724-d0290732f2ea\") " Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.008320 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-utilities" (OuterVolumeSpecName: "utilities") pod "aef56874-7291-4a81-a724-d0290732f2ea" (UID: "aef56874-7291-4a81-a724-d0290732f2ea"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.012837 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef56874-7291-4a81-a724-d0290732f2ea-kube-api-access-wkcjj" (OuterVolumeSpecName: "kube-api-access-wkcjj") pod "aef56874-7291-4a81-a724-d0290732f2ea" (UID: "aef56874-7291-4a81-a724-d0290732f2ea"). InnerVolumeSpecName "kube-api-access-wkcjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.109602 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.109639 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkcjj\" (UniqueName: \"kubernetes.io/projected/aef56874-7291-4a81-a724-d0290732f2ea-kube-api-access-wkcjj\") on node \"crc\" DevicePath \"\"" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.134652 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aef56874-7291-4a81-a724-d0290732f2ea" (UID: "aef56874-7291-4a81-a724-d0290732f2ea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.211911 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef56874-7291-4a81-a724-d0290732f2ea-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.374726 4817 generic.go:334] "Generic (PLEG): container finished" podID="aef56874-7291-4a81-a724-d0290732f2ea" containerID="638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099" exitCode=0 Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.374792 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qt2m" event={"ID":"aef56874-7291-4a81-a724-d0290732f2ea","Type":"ContainerDied","Data":"638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099"} Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.374849 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qt2m" event={"ID":"aef56874-7291-4a81-a724-d0290732f2ea","Type":"ContainerDied","Data":"9f3dfe949df01afaf9de1be56824fbb59508c578f408fb4b8fa94fdd228b1901"} Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.374867 4817 scope.go:117] "RemoveContainer" containerID="638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.374799 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6qt2m" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.399906 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6qt2m"] Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.403357 4817 scope.go:117] "RemoveContainer" containerID="562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.410690 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6qt2m"] Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.428852 4817 scope.go:117] "RemoveContainer" containerID="0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.475576 4817 scope.go:117] "RemoveContainer" containerID="638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099" Feb 18 15:02:00 crc kubenswrapper[4817]: E0218 15:02:00.476206 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099\": container with ID starting with 638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099 not found: ID does not exist" containerID="638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.476306 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099"} err="failed to get container status \"638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099\": rpc error: code = NotFound desc = could not find container \"638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099\": container with ID starting with 638f02c44eedaa973ed014aa4b5be19fb87769c8cccdf409d573992f09b62099 not found: ID does 
not exist" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.476376 4817 scope.go:117] "RemoveContainer" containerID="562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0" Feb 18 15:02:00 crc kubenswrapper[4817]: E0218 15:02:00.476724 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0\": container with ID starting with 562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0 not found: ID does not exist" containerID="562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.476778 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0"} err="failed to get container status \"562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0\": rpc error: code = NotFound desc = could not find container \"562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0\": container with ID starting with 562cb575529028084dabb6fc54360267ad5a587237b7588f3175e6a0d675d3f0 not found: ID does not exist" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.476807 4817 scope.go:117] "RemoveContainer" containerID="0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a" Feb 18 15:02:00 crc kubenswrapper[4817]: E0218 15:02:00.477177 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a\": container with ID starting with 0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a not found: ID does not exist" containerID="0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a" Feb 18 15:02:00 crc kubenswrapper[4817]: I0218 15:02:00.477230 4817 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a"} err="failed to get container status \"0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a\": rpc error: code = NotFound desc = could not find container \"0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a\": container with ID starting with 0ffdc3dba54558724380f0bf6e678d3289b268f01ef0ee28232380468a46da5a not found: ID does not exist" Feb 18 15:02:02 crc kubenswrapper[4817]: I0218 15:02:02.183724 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef56874-7291-4a81-a724-d0290732f2ea" path="/var/lib/kubelet/pods/aef56874-7291-4a81-a724-d0290732f2ea/volumes" Feb 18 15:02:12 crc kubenswrapper[4817]: I0218 15:02:12.864035 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:02:12 crc kubenswrapper[4817]: I0218 15:02:12.864519 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:02:42 crc kubenswrapper[4817]: I0218 15:02:42.863770 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:02:42 crc kubenswrapper[4817]: I0218 15:02:42.864399 4817 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:03:03 crc kubenswrapper[4817]: I0218 15:03:03.785866 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kt4w8"] Feb 18 15:03:03 crc kubenswrapper[4817]: E0218 15:03:03.786963 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef56874-7291-4a81-a724-d0290732f2ea" containerName="extract-content" Feb 18 15:03:03 crc kubenswrapper[4817]: I0218 15:03:03.787006 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef56874-7291-4a81-a724-d0290732f2ea" containerName="extract-content" Feb 18 15:03:03 crc kubenswrapper[4817]: E0218 15:03:03.787028 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef56874-7291-4a81-a724-d0290732f2ea" containerName="registry-server" Feb 18 15:03:03 crc kubenswrapper[4817]: I0218 15:03:03.787038 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef56874-7291-4a81-a724-d0290732f2ea" containerName="registry-server" Feb 18 15:03:03 crc kubenswrapper[4817]: E0218 15:03:03.787071 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef56874-7291-4a81-a724-d0290732f2ea" containerName="extract-utilities" Feb 18 15:03:03 crc kubenswrapper[4817]: I0218 15:03:03.787079 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef56874-7291-4a81-a724-d0290732f2ea" containerName="extract-utilities" Feb 18 15:03:03 crc kubenswrapper[4817]: I0218 15:03:03.787289 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef56874-7291-4a81-a724-d0290732f2ea" containerName="registry-server" Feb 18 15:03:03 crc kubenswrapper[4817]: I0218 15:03:03.789399 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:03 crc kubenswrapper[4817]: I0218 15:03:03.805003 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kt4w8"] Feb 18 15:03:03 crc kubenswrapper[4817]: I0218 15:03:03.977833 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-catalog-content\") pod \"certified-operators-kt4w8\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") " pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:03 crc kubenswrapper[4817]: I0218 15:03:03.977893 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tc6s\" (UniqueName: \"kubernetes.io/projected/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-kube-api-access-7tc6s\") pod \"certified-operators-kt4w8\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") " pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:03 crc kubenswrapper[4817]: I0218 15:03:03.978010 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-utilities\") pod \"certified-operators-kt4w8\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") " pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:04 crc kubenswrapper[4817]: I0218 15:03:04.079507 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-utilities\") pod \"certified-operators-kt4w8\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") " pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:04 crc kubenswrapper[4817]: I0218 15:03:04.079639 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-catalog-content\") pod \"certified-operators-kt4w8\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") " pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:04 crc kubenswrapper[4817]: I0218 15:03:04.079673 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tc6s\" (UniqueName: \"kubernetes.io/projected/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-kube-api-access-7tc6s\") pod \"certified-operators-kt4w8\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") " pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:04 crc kubenswrapper[4817]: I0218 15:03:04.080286 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-utilities\") pod \"certified-operators-kt4w8\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") " pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:04 crc kubenswrapper[4817]: I0218 15:03:04.080329 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-catalog-content\") pod \"certified-operators-kt4w8\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") " pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:04 crc kubenswrapper[4817]: I0218 15:03:04.098968 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tc6s\" (UniqueName: \"kubernetes.io/projected/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-kube-api-access-7tc6s\") pod \"certified-operators-kt4w8\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") " pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:04 crc kubenswrapper[4817]: I0218 15:03:04.118019 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kt4w8" Feb 18 15:03:04 crc kubenswrapper[4817]: I0218 15:03:04.594382 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kt4w8"] Feb 18 15:03:05 crc kubenswrapper[4817]: I0218 15:03:05.542743 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kt4w8" event={"ID":"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650","Type":"ContainerStarted","Data":"5c3b6fd86d0aa298511f8947589b82d8117d94d9dfd36a0d1d5e75f1215a629f"} Feb 18 15:03:06 crc kubenswrapper[4817]: I0218 15:03:06.553350 4817 generic.go:334] "Generic (PLEG): container finished" podID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerID="02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae" exitCode=0 Feb 18 15:03:06 crc kubenswrapper[4817]: I0218 15:03:06.553387 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kt4w8" event={"ID":"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650","Type":"ContainerDied","Data":"02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae"} Feb 18 15:03:07 crc kubenswrapper[4817]: I0218 15:03:07.567713 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kt4w8" event={"ID":"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650","Type":"ContainerStarted","Data":"fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144"} Feb 18 15:03:09 crc kubenswrapper[4817]: I0218 15:03:09.597234 4817 generic.go:334] "Generic (PLEG): container finished" podID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerID="fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144" exitCode=0 Feb 18 15:03:09 crc kubenswrapper[4817]: I0218 15:03:09.597311 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kt4w8" 
event={"ID":"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650","Type":"ContainerDied","Data":"fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144"}
Feb 18 15:03:10 crc kubenswrapper[4817]: I0218 15:03:10.610951 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kt4w8" event={"ID":"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650","Type":"ContainerStarted","Data":"16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc"}
Feb 18 15:03:10 crc kubenswrapper[4817]: I0218 15:03:10.630412 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kt4w8" podStartSLOduration=4.152450522 podStartE2EDuration="7.630393941s" podCreationTimestamp="2026-02-18 15:03:03 +0000 UTC" firstStartedPulling="2026-02-18 15:03:06.555325168 +0000 UTC m=+3849.130861151" lastFinishedPulling="2026-02-18 15:03:10.033268567 +0000 UTC m=+3852.608804570" observedRunningTime="2026-02-18 15:03:10.62560095 +0000 UTC m=+3853.201136953" watchObservedRunningTime="2026-02-18 15:03:10.630393941 +0000 UTC m=+3853.205929924"
Feb 18 15:03:12 crc kubenswrapper[4817]: I0218 15:03:12.863956 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 15:03:12 crc kubenswrapper[4817]: I0218 15:03:12.864366 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 15:03:12 crc kubenswrapper[4817]: I0218 15:03:12.864426 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb"
Feb 18 15:03:12 crc kubenswrapper[4817]: I0218 15:03:12.865327 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 15:03:12 crc kubenswrapper[4817]: I0218 15:03:12.865391 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" gracePeriod=600
Feb 18 15:03:12 crc kubenswrapper[4817]: E0218 15:03:12.987276 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:03:13 crc kubenswrapper[4817]: I0218 15:03:13.639086 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" exitCode=0
Feb 18 15:03:13 crc kubenswrapper[4817]: I0218 15:03:13.639172 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"}
Feb 18 15:03:13 crc kubenswrapper[4817]: I0218 15:03:13.639478 4817 scope.go:117] "RemoveContainer" containerID="912706dfe852d2fdaef61a959cfa69bd788ba5cc2a058bcd9b22176603043b9b"
Feb 18 15:03:13 crc kubenswrapper[4817]: I0218 15:03:13.640373 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:03:13 crc kubenswrapper[4817]: E0218 15:03:13.640781 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:03:14 crc kubenswrapper[4817]: I0218 15:03:14.118140 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kt4w8"
Feb 18 15:03:14 crc kubenswrapper[4817]: I0218 15:03:14.118206 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kt4w8"
Feb 18 15:03:14 crc kubenswrapper[4817]: I0218 15:03:14.166121 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kt4w8"
Feb 18 15:03:24 crc kubenswrapper[4817]: I0218 15:03:24.168062 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kt4w8"
Feb 18 15:03:24 crc kubenswrapper[4817]: I0218 15:03:24.216871 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kt4w8"]
Feb 18 15:03:24 crc kubenswrapper[4817]: I0218 15:03:24.737392 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kt4w8" podUID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerName="registry-server" containerID="cri-o://16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc" gracePeriod=2
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.174225 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:03:25 crc kubenswrapper[4817]: E0218 15:03:25.174895 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.427700 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kt4w8"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.562415 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-utilities\") pod \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") "
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.562490 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tc6s\" (UniqueName: \"kubernetes.io/projected/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-kube-api-access-7tc6s\") pod \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") "
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.562726 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-catalog-content\") pod \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\" (UID: \"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650\") "
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.563382 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-utilities" (OuterVolumeSpecName: "utilities") pod "37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" (UID: "37d1b82f-ba72-4f8c-aa0d-f7780d6bb650"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.567991 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-kube-api-access-7tc6s" (OuterVolumeSpecName: "kube-api-access-7tc6s") pod "37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" (UID: "37d1b82f-ba72-4f8c-aa0d-f7780d6bb650"). InnerVolumeSpecName "kube-api-access-7tc6s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.624877 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" (UID: "37d1b82f-ba72-4f8c-aa0d-f7780d6bb650"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.665559 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.665604 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.665617 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tc6s\" (UniqueName: \"kubernetes.io/projected/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650-kube-api-access-7tc6s\") on node \"crc\" DevicePath \"\""
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.752081 4817 generic.go:334] "Generic (PLEG): container finished" podID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerID="16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc" exitCode=0
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.752153 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kt4w8"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.752156 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kt4w8" event={"ID":"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650","Type":"ContainerDied","Data":"16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc"}
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.752419 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kt4w8" event={"ID":"37d1b82f-ba72-4f8c-aa0d-f7780d6bb650","Type":"ContainerDied","Data":"5c3b6fd86d0aa298511f8947589b82d8117d94d9dfd36a0d1d5e75f1215a629f"}
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.752485 4817 scope.go:117] "RemoveContainer" containerID="16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.777964 4817 scope.go:117] "RemoveContainer" containerID="fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.803254 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kt4w8"]
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.817237 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kt4w8"]
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.819125 4817 scope.go:117] "RemoveContainer" containerID="02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.856508 4817 scope.go:117] "RemoveContainer" containerID="16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc"
Feb 18 15:03:25 crc kubenswrapper[4817]: E0218 15:03:25.856998 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc\": container with ID starting with 16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc not found: ID does not exist" containerID="16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.857032 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc"} err="failed to get container status \"16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc\": rpc error: code = NotFound desc = could not find container \"16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc\": container with ID starting with 16a9847c7826f46726dee440a9542ee6f7407d7b3df9f2b2be490de7cf01febc not found: ID does not exist"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.857053 4817 scope.go:117] "RemoveContainer" containerID="fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144"
Feb 18 15:03:25 crc kubenswrapper[4817]: E0218 15:03:25.857355 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144\": container with ID starting with fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144 not found: ID does not exist" containerID="fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.857402 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144"} err="failed to get container status \"fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144\": rpc error: code = NotFound desc = could not find container \"fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144\": container with ID starting with fa61bd8e57fac978ba9b4c4d83490089289fb6df675d4b47c7ca9804025d2144 not found: ID does not exist"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.857429 4817 scope.go:117] "RemoveContainer" containerID="02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae"
Feb 18 15:03:25 crc kubenswrapper[4817]: E0218 15:03:25.857739 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae\": container with ID starting with 02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae not found: ID does not exist" containerID="02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae"
Feb 18 15:03:25 crc kubenswrapper[4817]: I0218 15:03:25.857767 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae"} err="failed to get container status \"02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae\": rpc error: code = NotFound desc = could not find container \"02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae\": container with ID starting with 02f596420682df18f1ce28d08d84808b79c9136786bc2125461270e0c0a947ae not found: ID does not exist"
Feb 18 15:03:26 crc kubenswrapper[4817]: I0218 15:03:26.184707 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" path="/var/lib/kubelet/pods/37d1b82f-ba72-4f8c-aa0d-f7780d6bb650/volumes"
Feb 18 15:03:40 crc kubenswrapper[4817]: I0218 15:03:40.171797 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:03:40 crc kubenswrapper[4817]: E0218 15:03:40.173710 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:03:51 crc kubenswrapper[4817]: I0218 15:03:51.172125 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:03:51 crc kubenswrapper[4817]: E0218 15:03:51.172870 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:04:06 crc kubenswrapper[4817]: I0218 15:04:06.171585 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:04:06 crc kubenswrapper[4817]: E0218 15:04:06.172375 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:04:19 crc kubenswrapper[4817]: I0218 15:04:19.172869 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:04:19 crc kubenswrapper[4817]: E0218 15:04:19.174071 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:04:31 crc kubenswrapper[4817]: I0218 15:04:31.171431 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:04:31 crc kubenswrapper[4817]: E0218 15:04:31.172296 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:04:46 crc kubenswrapper[4817]: I0218 15:04:46.171299 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:04:46 crc kubenswrapper[4817]: E0218 15:04:46.172922 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:04:57 crc kubenswrapper[4817]: I0218 15:04:57.171639 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:04:57 crc kubenswrapper[4817]: E0218 15:04:57.172380 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:05:09 crc kubenswrapper[4817]: I0218 15:05:09.172477 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:05:09 crc kubenswrapper[4817]: E0218 15:05:09.173387 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:05:23 crc kubenswrapper[4817]: I0218 15:05:23.172217 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:05:23 crc kubenswrapper[4817]: E0218 15:05:23.172885 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:05:38 crc kubenswrapper[4817]: I0218 15:05:38.178872 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:05:38 crc kubenswrapper[4817]: E0218 15:05:38.180893 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:05:52 crc kubenswrapper[4817]: I0218 15:05:52.172425 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:05:52 crc kubenswrapper[4817]: E0218 15:05:52.173790 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:06:03 crc kubenswrapper[4817]: I0218 15:06:03.171819 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:06:03 crc kubenswrapper[4817]: E0218 15:06:03.172521 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.081791 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvpn"]
Feb 18 15:06:09 crc kubenswrapper[4817]: E0218 15:06:09.083037 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerName="extract-utilities"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.083059 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerName="extract-utilities"
Feb 18 15:06:09 crc kubenswrapper[4817]: E0218 15:06:09.083080 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerName="extract-content"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.083089 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerName="extract-content"
Feb 18 15:06:09 crc kubenswrapper[4817]: E0218 15:06:09.083120 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerName="registry-server"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.083128 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerName="registry-server"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.083385 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d1b82f-ba72-4f8c-aa0d-f7780d6bb650" containerName="registry-server"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.085340 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.093108 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvpn"]
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.257004 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-utilities\") pod \"redhat-marketplace-4pvpn\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") " pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.257399 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4lll\" (UniqueName: \"kubernetes.io/projected/377f544f-d438-4647-96cb-3cf569c9eaa9-kube-api-access-l4lll\") pod \"redhat-marketplace-4pvpn\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") " pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.257678 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-catalog-content\") pod \"redhat-marketplace-4pvpn\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") " pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.360277 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4lll\" (UniqueName: \"kubernetes.io/projected/377f544f-d438-4647-96cb-3cf569c9eaa9-kube-api-access-l4lll\") pod \"redhat-marketplace-4pvpn\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") " pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.360465 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-catalog-content\") pod \"redhat-marketplace-4pvpn\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") " pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.360526 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-utilities\") pod \"redhat-marketplace-4pvpn\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") " pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.361104 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-catalog-content\") pod \"redhat-marketplace-4pvpn\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") " pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.361411 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-utilities\") pod \"redhat-marketplace-4pvpn\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") " pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.381230 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4lll\" (UniqueName: \"kubernetes.io/projected/377f544f-d438-4647-96cb-3cf569c9eaa9-kube-api-access-l4lll\") pod \"redhat-marketplace-4pvpn\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") " pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.408101 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:09 crc kubenswrapper[4817]: I0218 15:06:09.944419 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvpn"]
Feb 18 15:06:10 crc kubenswrapper[4817]: I0218 15:06:10.836736 4817 generic.go:334] "Generic (PLEG): container finished" podID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerID="910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e" exitCode=0
Feb 18 15:06:10 crc kubenswrapper[4817]: I0218 15:06:10.836783 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvpn" event={"ID":"377f544f-d438-4647-96cb-3cf569c9eaa9","Type":"ContainerDied","Data":"910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e"}
Feb 18 15:06:10 crc kubenswrapper[4817]: I0218 15:06:10.836810 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvpn" event={"ID":"377f544f-d438-4647-96cb-3cf569c9eaa9","Type":"ContainerStarted","Data":"d18ad9c6b178345ce8290d1a1ab4e35d22de10af19016be627145e11deb8b25c"}
Feb 18 15:06:10 crc kubenswrapper[4817]: I0218 15:06:10.840441 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 15:06:11 crc kubenswrapper[4817]: I0218 15:06:11.848305 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvpn" event={"ID":"377f544f-d438-4647-96cb-3cf569c9eaa9","Type":"ContainerStarted","Data":"4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249"}
Feb 18 15:06:12 crc kubenswrapper[4817]: I0218 15:06:12.858841 4817 generic.go:334] "Generic (PLEG): container finished" podID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerID="4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249" exitCode=0
Feb 18 15:06:12 crc kubenswrapper[4817]: I0218 15:06:12.858920 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvpn" event={"ID":"377f544f-d438-4647-96cb-3cf569c9eaa9","Type":"ContainerDied","Data":"4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249"}
Feb 18 15:06:13 crc kubenswrapper[4817]: I0218 15:06:13.870601 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvpn" event={"ID":"377f544f-d438-4647-96cb-3cf569c9eaa9","Type":"ContainerStarted","Data":"138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01"}
Feb 18 15:06:13 crc kubenswrapper[4817]: I0218 15:06:13.903080 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4pvpn" podStartSLOduration=2.455917537 podStartE2EDuration="4.903057533s" podCreationTimestamp="2026-02-18 15:06:09 +0000 UTC" firstStartedPulling="2026-02-18 15:06:10.840171743 +0000 UTC m=+4033.415707726" lastFinishedPulling="2026-02-18 15:06:13.287311739 +0000 UTC m=+4035.862847722" observedRunningTime="2026-02-18 15:06:13.890916837 +0000 UTC m=+4036.466452820" watchObservedRunningTime="2026-02-18 15:06:13.903057533 +0000 UTC m=+4036.478593516"
Feb 18 15:06:15 crc kubenswrapper[4817]: I0218 15:06:15.172436 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:06:15 crc kubenswrapper[4817]: E0218 15:06:15.172750 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:06:19 crc kubenswrapper[4817]: I0218 15:06:19.408324 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:19 crc kubenswrapper[4817]: I0218 15:06:19.408731 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:19 crc kubenswrapper[4817]: I0218 15:06:19.454437 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:19 crc kubenswrapper[4817]: I0218 15:06:19.989091 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.075596 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvpn"]
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.076759 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4pvpn" podUID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerName="registry-server" containerID="cri-o://138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01" gracePeriod=2
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.612949 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pvpn"
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.676299 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-catalog-content\") pod \"377f544f-d438-4647-96cb-3cf569c9eaa9\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") "
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.676468 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4lll\" (UniqueName: \"kubernetes.io/projected/377f544f-d438-4647-96cb-3cf569c9eaa9-kube-api-access-l4lll\") pod \"377f544f-d438-4647-96cb-3cf569c9eaa9\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") "
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.676527 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-utilities\") pod \"377f544f-d438-4647-96cb-3cf569c9eaa9\" (UID: \"377f544f-d438-4647-96cb-3cf569c9eaa9\") "
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.678753 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-utilities" (OuterVolumeSpecName: "utilities") pod "377f544f-d438-4647-96cb-3cf569c9eaa9" (UID: "377f544f-d438-4647-96cb-3cf569c9eaa9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.683400 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377f544f-d438-4647-96cb-3cf569c9eaa9-kube-api-access-l4lll" (OuterVolumeSpecName: "kube-api-access-l4lll") pod "377f544f-d438-4647-96cb-3cf569c9eaa9" (UID: "377f544f-d438-4647-96cb-3cf569c9eaa9"). InnerVolumeSpecName "kube-api-access-l4lll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.704212 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "377f544f-d438-4647-96cb-3cf569c9eaa9" (UID: "377f544f-d438-4647-96cb-3cf569c9eaa9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.780152 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.780222 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4lll\" (UniqueName: \"kubernetes.io/projected/377f544f-d438-4647-96cb-3cf569c9eaa9-kube-api-access-l4lll\") on node \"crc\" DevicePath \"\""
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.780237 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377f544f-d438-4647-96cb-3cf569c9eaa9-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.970770 4817 generic.go:334] "Generic (PLEG): container finished" podID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerID="138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01" exitCode=0
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.970827 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4pvpn" event={"ID":"377f544f-d438-4647-96cb-3cf569c9eaa9","Type":"ContainerDied","Data":"138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01"}
Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.970861 4817 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-4pvpn" event={"ID":"377f544f-d438-4647-96cb-3cf569c9eaa9","Type":"ContainerDied","Data":"d18ad9c6b178345ce8290d1a1ab4e35d22de10af19016be627145e11deb8b25c"} Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.970884 4817 scope.go:117] "RemoveContainer" containerID="138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01" Feb 18 15:06:23 crc kubenswrapper[4817]: I0218 15:06:23.971076 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4pvpn" Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.009444 4817 scope.go:117] "RemoveContainer" containerID="4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249" Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.014508 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvpn"] Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.024173 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4pvpn"] Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.035255 4817 scope.go:117] "RemoveContainer" containerID="910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e" Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.089665 4817 scope.go:117] "RemoveContainer" containerID="138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01" Feb 18 15:06:24 crc kubenswrapper[4817]: E0218 15:06:24.090137 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01\": container with ID starting with 138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01 not found: ID does not exist" containerID="138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01" Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.090172 4817 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01"} err="failed to get container status \"138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01\": rpc error: code = NotFound desc = could not find container \"138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01\": container with ID starting with 138de3847aac81a319da9cf0f1442b1203ea69dbfc96c1688189fb46dec3ca01 not found: ID does not exist" Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.090198 4817 scope.go:117] "RemoveContainer" containerID="4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249" Feb 18 15:06:24 crc kubenswrapper[4817]: E0218 15:06:24.090616 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249\": container with ID starting with 4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249 not found: ID does not exist" containerID="4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249" Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.090633 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249"} err="failed to get container status \"4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249\": rpc error: code = NotFound desc = could not find container \"4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249\": container with ID starting with 4f38776a6325b5f55461576854fda4c1c97876c68524d4594a959d47f5782249 not found: ID does not exist" Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.090647 4817 scope.go:117] "RemoveContainer" containerID="910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e" Feb 18 15:06:24 crc kubenswrapper[4817]: E0218 
15:06:24.091068 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e\": container with ID starting with 910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e not found: ID does not exist" containerID="910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e" Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.091101 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e"} err="failed to get container status \"910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e\": rpc error: code = NotFound desc = could not find container \"910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e\": container with ID starting with 910f11649683d9fde4b287ea6822e157914fc2fc97f7a8414ffd8cde4982998e not found: ID does not exist" Feb 18 15:06:24 crc kubenswrapper[4817]: I0218 15:06:24.183552 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="377f544f-d438-4647-96cb-3cf569c9eaa9" path="/var/lib/kubelet/pods/377f544f-d438-4647-96cb-3cf569c9eaa9/volumes" Feb 18 15:06:26 crc kubenswrapper[4817]: I0218 15:06:26.171764 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" Feb 18 15:06:26 crc kubenswrapper[4817]: E0218 15:06:26.172377 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:06:37 crc kubenswrapper[4817]: I0218 15:06:37.172128 
4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" Feb 18 15:06:37 crc kubenswrapper[4817]: E0218 15:06:37.172970 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:06:51 crc kubenswrapper[4817]: I0218 15:06:51.172174 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" Feb 18 15:06:51 crc kubenswrapper[4817]: E0218 15:06:51.173105 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:07:04 crc kubenswrapper[4817]: I0218 15:07:04.172342 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" Feb 18 15:07:04 crc kubenswrapper[4817]: E0218 15:07:04.174186 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:07:16 crc kubenswrapper[4817]: I0218 
15:07:16.172434 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" Feb 18 15:07:16 crc kubenswrapper[4817]: E0218 15:07:16.173264 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:07:23 crc kubenswrapper[4817]: E0218 15:07:23.107043 4817 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.38:52986->38.102.83.38:36749: write tcp 38.102.83.38:52986->38.102.83.38:36749: write: broken pipe Feb 18 15:07:30 crc kubenswrapper[4817]: I0218 15:07:30.171907 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" Feb 18 15:07:30 crc kubenswrapper[4817]: E0218 15:07:30.172766 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:07:45 crc kubenswrapper[4817]: I0218 15:07:45.173094 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" Feb 18 15:07:45 crc kubenswrapper[4817]: E0218 15:07:45.174298 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.061815 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 18 15:07:51 crc kubenswrapper[4817]: E0218 15:07:51.062871 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerName="extract-utilities" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.062894 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerName="extract-utilities" Feb 18 15:07:51 crc kubenswrapper[4817]: E0218 15:07:51.062911 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerName="extract-content" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.062919 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerName="extract-content" Feb 18 15:07:51 crc kubenswrapper[4817]: E0218 15:07:51.062949 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerName="registry-server" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.062957 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerName="registry-server" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.063265 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="377f544f-d438-4647-96cb-3cf569c9eaa9" containerName="registry-server" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.064312 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.066605 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.068152 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rrsh5" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.068206 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.068170 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.079512 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.193673 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.193766 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-config-data\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.193876 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.194040 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.194133 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5blq4\" (UniqueName: \"kubernetes.io/projected/33627f57-553f-4c87-a517-4fbe8d221665-kube-api-access-5blq4\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.194261 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.194351 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.194474 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.194548 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.296570 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.296651 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.296691 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5blq4\" (UniqueName: \"kubernetes.io/projected/33627f57-553f-4c87-a517-4fbe8d221665-kube-api-access-5blq4\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.296757 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.296802 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.296901 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.296935 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.297113 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.297151 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-workdir\") pod 
\"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.297184 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-config-data\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.298365 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-config-data\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.298855 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.298959 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.299572 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " 
pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.303866 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.304355 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.308496 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.316016 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5blq4\" (UniqueName: \"kubernetes.io/projected/33627f57-553f-4c87-a517-4fbe8d221665-kube-api-access-5blq4\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.346547 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.389216 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 15:07:51 crc kubenswrapper[4817]: I0218 15:07:51.822817 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 18 15:07:52 crc kubenswrapper[4817]: I0218 15:07:52.815616 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"33627f57-553f-4c87-a517-4fbe8d221665","Type":"ContainerStarted","Data":"0e6916d7d3a28ee762ba7714d3521642e04a11a71f492ecf11bc1af59c7d6d72"} Feb 18 15:07:57 crc kubenswrapper[4817]: I0218 15:07:57.172780 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" Feb 18 15:07:57 crc kubenswrapper[4817]: E0218 15:07:57.173657 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:08:09 crc kubenswrapper[4817]: I0218 15:08:09.180220 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5" Feb 18 15:08:09 crc kubenswrapper[4817]: E0218 15:08:09.181780 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:08:19 crc kubenswrapper[4817]: E0218 15:08:19.854097 4817 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 18 15:08:19 crc kubenswrapper[4817]: E0218 15:08:19.854798 4817 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh
_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5blq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(33627f57-553f-4c87-a517-4fbe8d221665): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 18 15:08:19 crc kubenswrapper[4817]: E0218 15:08:19.856038 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="33627f57-553f-4c87-a517-4fbe8d221665"
Feb 18 15:08:20 crc kubenswrapper[4817]: E0218 15:08:20.093378 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="33627f57-553f-4c87-a517-4fbe8d221665"
Feb 18 15:08:24 crc kubenswrapper[4817]: I0218 15:08:24.172386 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:08:25 crc kubenswrapper[4817]: I0218 15:08:25.137735 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"88f9ad2af8642e25df06dc7d11c08f0c159cf5c0e3364736e58acd12597763ea"}
Feb 18 15:08:33 crc kubenswrapper[4817]: I0218 15:08:33.621156 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 18 15:08:35 crc kubenswrapper[4817]: I0218 15:08:35.223371 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"33627f57-553f-4c87-a517-4fbe8d221665","Type":"ContainerStarted","Data":"2c7851cc8d2d83f51035793ceaf51f382775fd6eeeed12ebf191ff6f75f3b2e7"}
Feb 18 15:08:35 crc kubenswrapper[4817]: I0218 15:08:35.253314 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.693158328 podStartE2EDuration="45.253290749s" podCreationTimestamp="2026-02-18 15:07:50 +0000 UTC" firstStartedPulling="2026-02-18 15:07:52.058544102 +0000 UTC m=+4134.634080125" lastFinishedPulling="2026-02-18 15:08:33.618676563 +0000 UTC m=+4176.194212546" observedRunningTime="2026-02-18 15:08:35.241162663 +0000 UTC m=+4177.816698646" watchObservedRunningTime="2026-02-18 15:08:35.253290749 +0000 UTC m=+4177.828826732"
Feb 18 15:10:42 crc kubenswrapper[4817]: I0218 15:10:42.863576 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 15:10:42 crc kubenswrapper[4817]: I0218 15:10:42.864215 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 15:11:12 crc kubenswrapper[4817]: I0218 15:11:12.865172 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 15:11:12 crc kubenswrapper[4817]: I0218 15:11:12.865800 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.358556 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m9wzb"]
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.361720 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.371727 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9wzb"]
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.485913 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-utilities\") pod \"community-operators-m9wzb\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") " pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.486051 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9gwg\" (UniqueName: \"kubernetes.io/projected/15062f03-ca34-4a6a-8a62-8819dbb8056e-kube-api-access-h9gwg\") pod \"community-operators-m9wzb\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") " pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.486133 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-catalog-content\") pod \"community-operators-m9wzb\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") " pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.589398 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-catalog-content\") pod \"community-operators-m9wzb\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") " pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.589541 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-utilities\") pod \"community-operators-m9wzb\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") " pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.589620 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9gwg\" (UniqueName: \"kubernetes.io/projected/15062f03-ca34-4a6a-8a62-8819dbb8056e-kube-api-access-h9gwg\") pod \"community-operators-m9wzb\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") " pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.590416 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-utilities\") pod \"community-operators-m9wzb\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") " pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.590597 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-catalog-content\") pod \"community-operators-m9wzb\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") " pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.617466 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9gwg\" (UniqueName: \"kubernetes.io/projected/15062f03-ca34-4a6a-8a62-8819dbb8056e-kube-api-access-h9gwg\") pod \"community-operators-m9wzb\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") " pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:36 crc kubenswrapper[4817]: I0218 15:11:36.691847 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:37 crc kubenswrapper[4817]: I0218 15:11:37.523321 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m9wzb"]
Feb 18 15:11:38 crc kubenswrapper[4817]: I0218 15:11:38.098522 4817 generic.go:334] "Generic (PLEG): container finished" podID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerID="72de6b4cdb050add3d49da72a08a7af7881a7c84f4f3ac078ed8cf1cfae074e1" exitCode=0
Feb 18 15:11:38 crc kubenswrapper[4817]: I0218 15:11:38.098629 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9wzb" event={"ID":"15062f03-ca34-4a6a-8a62-8819dbb8056e","Type":"ContainerDied","Data":"72de6b4cdb050add3d49da72a08a7af7881a7c84f4f3ac078ed8cf1cfae074e1"}
Feb 18 15:11:38 crc kubenswrapper[4817]: I0218 15:11:38.098809 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9wzb" event={"ID":"15062f03-ca34-4a6a-8a62-8819dbb8056e","Type":"ContainerStarted","Data":"d70b1faaebf0924d51db5436f59588471869612ef562bc50e7b14e37b16b3948"}
Feb 18 15:11:38 crc kubenswrapper[4817]: I0218 15:11:38.100874 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 18 15:11:40 crc kubenswrapper[4817]: I0218 15:11:40.118555 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9wzb" event={"ID":"15062f03-ca34-4a6a-8a62-8819dbb8056e","Type":"ContainerStarted","Data":"0e568388c5ad7510a3cecb4574a838ca93bd76873215b8377e7e9b747ee85e94"}
Feb 18 15:11:42 crc kubenswrapper[4817]: I0218 15:11:42.180301 4817 generic.go:334] "Generic (PLEG): container finished" podID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerID="0e568388c5ad7510a3cecb4574a838ca93bd76873215b8377e7e9b747ee85e94" exitCode=0
Feb 18 15:11:42 crc kubenswrapper[4817]: I0218 15:11:42.183832 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9wzb" event={"ID":"15062f03-ca34-4a6a-8a62-8819dbb8056e","Type":"ContainerDied","Data":"0e568388c5ad7510a3cecb4574a838ca93bd76873215b8377e7e9b747ee85e94"}
Feb 18 15:11:42 crc kubenswrapper[4817]: I0218 15:11:42.863079 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 15:11:42 crc kubenswrapper[4817]: I0218 15:11:42.863627 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 15:11:42 crc kubenswrapper[4817]: I0218 15:11:42.863725 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb"
Feb 18 15:11:42 crc kubenswrapper[4817]: I0218 15:11:42.864592 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88f9ad2af8642e25df06dc7d11c08f0c159cf5c0e3364736e58acd12597763ea"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 15:11:42 crc kubenswrapper[4817]: I0218 15:11:42.864762 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://88f9ad2af8642e25df06dc7d11c08f0c159cf5c0e3364736e58acd12597763ea" gracePeriod=600
Feb 18 15:11:43 crc kubenswrapper[4817]: I0218 15:11:43.194811 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="88f9ad2af8642e25df06dc7d11c08f0c159cf5c0e3364736e58acd12597763ea" exitCode=0
Feb 18 15:11:43 crc kubenswrapper[4817]: I0218 15:11:43.194884 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"88f9ad2af8642e25df06dc7d11c08f0c159cf5c0e3364736e58acd12597763ea"}
Feb 18 15:11:43 crc kubenswrapper[4817]: I0218 15:11:43.195263 4817 scope.go:117] "RemoveContainer" containerID="543e3f95f8135537b1664f7a273ab40c0b6484576e81397f5d8c663cd771ddc5"
Feb 18 15:11:43 crc kubenswrapper[4817]: I0218 15:11:43.198111 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9wzb" event={"ID":"15062f03-ca34-4a6a-8a62-8819dbb8056e","Type":"ContainerStarted","Data":"fa836901a0ad7d182fffab68d6213efd96f7386a243bca46241d1c3c1a0032fd"}
Feb 18 15:11:43 crc kubenswrapper[4817]: I0218 15:11:43.221957 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m9wzb" podStartSLOduration=2.586943637 podStartE2EDuration="7.221936707s" podCreationTimestamp="2026-02-18 15:11:36 +0000 UTC" firstStartedPulling="2026-02-18 15:11:38.10067176 +0000 UTC m=+4360.676207743" lastFinishedPulling="2026-02-18 15:11:42.73566483 +0000 UTC m=+4365.311200813" observedRunningTime="2026-02-18 15:11:43.21969222 +0000 UTC m=+4365.795228223" watchObservedRunningTime="2026-02-18 15:11:43.221936707 +0000 UTC m=+4365.797472680"
Feb 18 15:11:44 crc kubenswrapper[4817]: I0218 15:11:44.227196 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456"}
Feb 18 15:11:46 crc kubenswrapper[4817]: I0218 15:11:46.692474 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:46 crc kubenswrapper[4817]: I0218 15:11:46.693056 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:46 crc kubenswrapper[4817]: I0218 15:11:46.746357 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:47 crc kubenswrapper[4817]: I0218 15:11:47.363409 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:47 crc kubenswrapper[4817]: I0218 15:11:47.456898 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9wzb"]
Feb 18 15:11:49 crc kubenswrapper[4817]: I0218 15:11:49.276156 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m9wzb" podUID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerName="registry-server" containerID="cri-o://fa836901a0ad7d182fffab68d6213efd96f7386a243bca46241d1c3c1a0032fd" gracePeriod=2
Feb 18 15:11:50 crc kubenswrapper[4817]: I0218 15:11:50.288632 4817 generic.go:334] "Generic (PLEG): container finished" podID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerID="fa836901a0ad7d182fffab68d6213efd96f7386a243bca46241d1c3c1a0032fd" exitCode=0
Feb 18 15:11:50 crc kubenswrapper[4817]: I0218 15:11:50.288699 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9wzb" event={"ID":"15062f03-ca34-4a6a-8a62-8819dbb8056e","Type":"ContainerDied","Data":"fa836901a0ad7d182fffab68d6213efd96f7386a243bca46241d1c3c1a0032fd"}
Feb 18 15:11:50 crc kubenswrapper[4817]: I0218 15:11:50.771914 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:50 crc kubenswrapper[4817]: I0218 15:11:50.922473 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-utilities\") pod \"15062f03-ca34-4a6a-8a62-8819dbb8056e\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") "
Feb 18 15:11:50 crc kubenswrapper[4817]: I0218 15:11:50.922617 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-catalog-content\") pod \"15062f03-ca34-4a6a-8a62-8819dbb8056e\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") "
Feb 18 15:11:50 crc kubenswrapper[4817]: I0218 15:11:50.922644 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9gwg\" (UniqueName: \"kubernetes.io/projected/15062f03-ca34-4a6a-8a62-8819dbb8056e-kube-api-access-h9gwg\") pod \"15062f03-ca34-4a6a-8a62-8819dbb8056e\" (UID: \"15062f03-ca34-4a6a-8a62-8819dbb8056e\") "
Feb 18 15:11:50 crc kubenswrapper[4817]: I0218 15:11:50.923522 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-utilities" (OuterVolumeSpecName: "utilities") pod "15062f03-ca34-4a6a-8a62-8819dbb8056e" (UID: "15062f03-ca34-4a6a-8a62-8819dbb8056e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 15:11:50 crc kubenswrapper[4817]: I0218 15:11:50.934827 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15062f03-ca34-4a6a-8a62-8819dbb8056e-kube-api-access-h9gwg" (OuterVolumeSpecName: "kube-api-access-h9gwg") pod "15062f03-ca34-4a6a-8a62-8819dbb8056e" (UID: "15062f03-ca34-4a6a-8a62-8819dbb8056e"). InnerVolumeSpecName "kube-api-access-h9gwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 15:11:50 crc kubenswrapper[4817]: I0218 15:11:50.989042 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15062f03-ca34-4a6a-8a62-8819dbb8056e" (UID: "15062f03-ca34-4a6a-8a62-8819dbb8056e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 15:11:51 crc kubenswrapper[4817]: I0218 15:11:51.025053 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 15:11:51 crc kubenswrapper[4817]: I0218 15:11:51.025092 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15062f03-ca34-4a6a-8a62-8819dbb8056e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 15:11:51 crc kubenswrapper[4817]: I0218 15:11:51.025103 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9gwg\" (UniqueName: \"kubernetes.io/projected/15062f03-ca34-4a6a-8a62-8819dbb8056e-kube-api-access-h9gwg\") on node \"crc\" DevicePath \"\""
Feb 18 15:11:51 crc kubenswrapper[4817]: I0218 15:11:51.304085 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m9wzb" event={"ID":"15062f03-ca34-4a6a-8a62-8819dbb8056e","Type":"ContainerDied","Data":"d70b1faaebf0924d51db5436f59588471869612ef562bc50e7b14e37b16b3948"}
Feb 18 15:11:51 crc kubenswrapper[4817]: I0218 15:11:51.304147 4817 scope.go:117] "RemoveContainer" containerID="fa836901a0ad7d182fffab68d6213efd96f7386a243bca46241d1c3c1a0032fd"
Feb 18 15:11:51 crc kubenswrapper[4817]: I0218 15:11:51.304168 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m9wzb"
Feb 18 15:11:51 crc kubenswrapper[4817]: I0218 15:11:51.332818 4817 scope.go:117] "RemoveContainer" containerID="0e568388c5ad7510a3cecb4574a838ca93bd76873215b8377e7e9b747ee85e94"
Feb 18 15:11:51 crc kubenswrapper[4817]: I0218 15:11:51.388375 4817 scope.go:117] "RemoveContainer" containerID="72de6b4cdb050add3d49da72a08a7af7881a7c84f4f3ac078ed8cf1cfae074e1"
Feb 18 15:11:51 crc kubenswrapper[4817]: I0218 15:11:51.402498 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m9wzb"]
Feb 18 15:11:51 crc kubenswrapper[4817]: I0218 15:11:51.431591 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m9wzb"]
Feb 18 15:11:52 crc kubenswrapper[4817]: I0218 15:11:52.218792 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15062f03-ca34-4a6a-8a62-8819dbb8056e" path="/var/lib/kubelet/pods/15062f03-ca34-4a6a-8a62-8819dbb8056e/volumes"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.505080 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4qk4p"]
Feb 18 15:12:02 crc kubenswrapper[4817]: E0218 15:12:02.506391 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerName="extract-content"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.506408 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerName="extract-content"
Feb 18 15:12:02 crc kubenswrapper[4817]: E0218 15:12:02.506432 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerName="extract-utilities"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.506440 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerName="extract-utilities"
Feb 18 15:12:02 crc kubenswrapper[4817]: E0218 15:12:02.506493 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerName="registry-server"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.506503 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerName="registry-server"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.506764 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="15062f03-ca34-4a6a-8a62-8819dbb8056e" containerName="registry-server"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.508769 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.532748 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qk4p"]
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.574478 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-utilities\") pod \"redhat-operators-4qk4p\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") " pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.574699 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm7qg\" (UniqueName: \"kubernetes.io/projected/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-kube-api-access-fm7qg\") pod \"redhat-operators-4qk4p\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") " pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.574765 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-catalog-content\") pod \"redhat-operators-4qk4p\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") " pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.676583 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm7qg\" (UniqueName: \"kubernetes.io/projected/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-kube-api-access-fm7qg\") pod \"redhat-operators-4qk4p\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") " pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.676648 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-catalog-content\") pod \"redhat-operators-4qk4p\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") " pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.676825 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-utilities\") pod \"redhat-operators-4qk4p\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") " pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.677485 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-catalog-content\") pod \"redhat-operators-4qk4p\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") " pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:02 crc kubenswrapper[4817]: I0218 15:12:02.677566 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-utilities\") pod \"redhat-operators-4qk4p\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") " pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:03 crc kubenswrapper[4817]: I0218 15:12:03.349030 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm7qg\" (UniqueName: \"kubernetes.io/projected/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-kube-api-access-fm7qg\") pod \"redhat-operators-4qk4p\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") " pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:03 crc kubenswrapper[4817]: I0218 15:12:03.439558 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:03 crc kubenswrapper[4817]: I0218 15:12:03.953270 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qk4p"]
Feb 18 15:12:03 crc kubenswrapper[4817]: W0218 15:12:03.960567 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18829d61_ffa6_4ae9_9468_5d13ba0d3a9b.slice/crio-1a48174204c9bcbedac0be5c6e0af22bf30bbfe43b387b9ec3863571ed862a5f WatchSource:0}: Error finding container 1a48174204c9bcbedac0be5c6e0af22bf30bbfe43b387b9ec3863571ed862a5f: Status 404 returned error can't find the container with id 1a48174204c9bcbedac0be5c6e0af22bf30bbfe43b387b9ec3863571ed862a5f
Feb 18 15:12:04 crc kubenswrapper[4817]: I0218 15:12:04.459761 4817 generic.go:334] "Generic (PLEG): container finished" podID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerID="c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66" exitCode=0
Feb 18 15:12:04 crc kubenswrapper[4817]: I0218 15:12:04.459869 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qk4p" event={"ID":"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b","Type":"ContainerDied","Data":"c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66"}
Feb 18 15:12:04 crc kubenswrapper[4817]: I0218 15:12:04.460002 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qk4p" event={"ID":"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b","Type":"ContainerStarted","Data":"1a48174204c9bcbedac0be5c6e0af22bf30bbfe43b387b9ec3863571ed862a5f"}
Feb 18 15:12:05 crc kubenswrapper[4817]: I0218 15:12:05.472047 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qk4p" event={"ID":"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b","Type":"ContainerStarted","Data":"8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c"}
Feb 18 15:12:11 crc kubenswrapper[4817]: I0218 15:12:11.531232 4817 generic.go:334] "Generic (PLEG): container finished" podID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerID="8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c" exitCode=0
Feb 18 15:12:11 crc kubenswrapper[4817]: I0218 15:12:11.531329 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qk4p" event={"ID":"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b","Type":"ContainerDied","Data":"8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c"}
Feb 18 15:12:13 crc kubenswrapper[4817]: I0218 15:12:13.560325 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qk4p" event={"ID":"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b","Type":"ContainerStarted","Data":"3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915"}
Feb 18 15:12:13 crc kubenswrapper[4817]: I0218 15:12:13.586951 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4qk4p" podStartSLOduration=4.105600053 podStartE2EDuration="11.586926848s" podCreationTimestamp="2026-02-18 15:12:02 +0000 UTC" firstStartedPulling="2026-02-18 15:12:04.461968773 +0000 UTC m=+4387.037504756" lastFinishedPulling="2026-02-18 15:12:11.943295518 +0000 UTC m=+4394.518831551" observedRunningTime="2026-02-18 15:12:13.582520877 +0000 UTC m=+4396.158056880" watchObservedRunningTime="2026-02-18 15:12:13.586926848 +0000 UTC m=+4396.162462831"
Feb 18 15:12:23 crc kubenswrapper[4817]: I0218 15:12:23.440652 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:23 crc kubenswrapper[4817]: I0218 15:12:23.441401 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:24 crc kubenswrapper[4817]: I0218 15:12:24.496646 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4qk4p" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="registry-server" probeResult="failure" output=<
Feb 18 15:12:24 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Feb 18 15:12:24 crc kubenswrapper[4817]: >
Feb 18 15:12:34 crc kubenswrapper[4817]: I0218 15:12:34.701125 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4qk4p" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="registry-server" probeResult="failure" output=<
Feb 18 15:12:34 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Feb 18 15:12:34 crc kubenswrapper[4817]: >
Feb 18 15:12:45 crc kubenswrapper[4817]: I0218 15:12:45.106067 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4qk4p" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="registry-server" probeResult="failure" output=<
Feb 18 15:12:45 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s
Feb 18 15:12:45 crc kubenswrapper[4817]: >
Feb 18 15:12:54 crc kubenswrapper[4817]: I0218 15:12:54.294658 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:54 crc kubenswrapper[4817]: I0218 15:12:54.352724 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:54 crc kubenswrapper[4817]: I0218 15:12:54.534718 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qk4p"]
Feb 18 15:12:55 crc kubenswrapper[4817]: I0218 15:12:55.985839 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4qk4p" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="registry-server" containerID="cri-o://3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915" gracePeriod=2
Feb 18 15:12:56 crc kubenswrapper[4817]: I0218 15:12:56.866948 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:56 crc kubenswrapper[4817]: I0218 15:12:56.890939 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-catalog-content\") pod \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") "
Feb 18 15:12:56 crc kubenswrapper[4817]: I0218 15:12:56.891039 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm7qg\" (UniqueName: \"kubernetes.io/projected/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-kube-api-access-fm7qg\") pod \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") "
Feb 18 15:12:56 crc kubenswrapper[4817]: I0218 15:12:56.891083 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-utilities\") pod \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\" (UID: \"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b\") "
Feb 18 15:12:56 crc kubenswrapper[4817]: I0218 15:12:56.892451 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-utilities" (OuterVolumeSpecName: "utilities") pod "18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" (UID: "18829d61-ffa6-4ae9-9468-5d13ba0d3a9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 15:12:56 crc kubenswrapper[4817]: I0218 15:12:56.897421 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-kube-api-access-fm7qg" (OuterVolumeSpecName: "kube-api-access-fm7qg") pod "18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" (UID: "18829d61-ffa6-4ae9-9468-5d13ba0d3a9b"). InnerVolumeSpecName "kube-api-access-fm7qg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 18 15:12:56 crc kubenswrapper[4817]: I0218 15:12:56.993139 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm7qg\" (UniqueName: \"kubernetes.io/projected/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-kube-api-access-fm7qg\") on node \"crc\" DevicePath \"\""
Feb 18 15:12:56 crc kubenswrapper[4817]: I0218 15:12:56.993167 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-utilities\") on node \"crc\" DevicePath \"\""
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.006013 4817 generic.go:334] "Generic (PLEG): container finished" podID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerID="3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915" exitCode=0
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.006061 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qk4p" event={"ID":"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b","Type":"ContainerDied","Data":"3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915"}
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.006090 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qk4p" event={"ID":"18829d61-ffa6-4ae9-9468-5d13ba0d3a9b","Type":"ContainerDied","Data":"1a48174204c9bcbedac0be5c6e0af22bf30bbfe43b387b9ec3863571ed862a5f"}
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.006108 4817 scope.go:117] "RemoveContainer" containerID="3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915"
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.006185 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qk4p"
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.023648 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" (UID: "18829d61-ffa6-4ae9-9468-5d13ba0d3a9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.047318 4817 scope.go:117] "RemoveContainer" containerID="8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c"
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.075437 4817 scope.go:117] "RemoveContainer" containerID="c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66"
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.095051 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.133951 4817 scope.go:117] "RemoveContainer" containerID="3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915"
Feb 18 15:12:57 crc kubenswrapper[4817]: E0218 15:12:57.137910 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915\": container with ID starting with 3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915 not found: ID does not exist" containerID="3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915"
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.137949 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915"} err="failed to get container status \"3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915\": rpc error: code = NotFound desc = could not find container \"3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915\": container with ID starting with 3ce90ae7a4cc73920172d8440790fe972a8025f6d45188e55592b0462e72d915 not found: ID does not exist"
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.137989 4817 scope.go:117] "RemoveContainer" containerID="8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c"
Feb 18 15:12:57 crc kubenswrapper[4817]: E0218 15:12:57.138425 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c\": container with ID starting with 8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c not found: ID does not exist" containerID="8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c"
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.138465 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c"} err="failed to get container status \"8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c\": rpc error: code = NotFound desc = could not find container \"8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c\": container with ID starting with 8d4e91330f78cf57535dea12fd806d3b256bef052e28a7b50b59a21eb3c1378c not found: ID does not exist"
Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.138490 4817 scope.go:117]
"RemoveContainer" containerID="c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66" Feb 18 15:12:57 crc kubenswrapper[4817]: E0218 15:12:57.138872 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66\": container with ID starting with c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66 not found: ID does not exist" containerID="c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66" Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.138893 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66"} err="failed to get container status \"c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66\": rpc error: code = NotFound desc = could not find container \"c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66\": container with ID starting with c12ec6a17771d25a195ad68355c5efd155deda61ae4eaf7dda078bc466c3ff66 not found: ID does not exist" Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.366695 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qk4p"] Feb 18 15:12:57 crc kubenswrapper[4817]: I0218 15:12:57.380869 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4qk4p"] Feb 18 15:12:58 crc kubenswrapper[4817]: I0218 15:12:58.189213 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" path="/var/lib/kubelet/pods/18829d61-ffa6-4ae9-9468-5d13ba0d3a9b/volumes" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.571492 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bg8pc"] Feb 18 15:13:56 crc kubenswrapper[4817]: E0218 15:13:56.573567 4817 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="extract-utilities" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.573703 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="extract-utilities" Feb 18 15:13:56 crc kubenswrapper[4817]: E0218 15:13:56.573824 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="registry-server" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.574014 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="registry-server" Feb 18 15:13:56 crc kubenswrapper[4817]: E0218 15:13:56.574115 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="extract-content" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.574196 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="extract-content" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.574551 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="18829d61-ffa6-4ae9-9468-5d13ba0d3a9b" containerName="registry-server" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.578439 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.585953 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bg8pc"] Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.592389 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-utilities\") pod \"certified-operators-bg8pc\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.594297 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-catalog-content\") pod \"certified-operators-bg8pc\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.594482 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd9n5\" (UniqueName: \"kubernetes.io/projected/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-kube-api-access-zd9n5\") pod \"certified-operators-bg8pc\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.695803 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-utilities\") pod \"certified-operators-bg8pc\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.696143 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-catalog-content\") pod \"certified-operators-bg8pc\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.696300 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd9n5\" (UniqueName: \"kubernetes.io/projected/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-kube-api-access-zd9n5\") pod \"certified-operators-bg8pc\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.696562 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-catalog-content\") pod \"certified-operators-bg8pc\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:56 crc kubenswrapper[4817]: I0218 15:13:56.696771 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-utilities\") pod \"certified-operators-bg8pc\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:57 crc kubenswrapper[4817]: I0218 15:13:57.050438 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd9n5\" (UniqueName: \"kubernetes.io/projected/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-kube-api-access-zd9n5\") pod \"certified-operators-bg8pc\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:57 crc kubenswrapper[4817]: I0218 15:13:57.199452 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:13:57 crc kubenswrapper[4817]: I0218 15:13:57.846996 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bg8pc"] Feb 18 15:13:58 crc kubenswrapper[4817]: I0218 15:13:58.604694 4817 generic.go:334] "Generic (PLEG): container finished" podID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerID="abd96d328fe6401aa15ae9eba37855429a9e11158e974d5e88a3af3d196a895f" exitCode=0 Feb 18 15:13:58 crc kubenswrapper[4817]: I0218 15:13:58.604794 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg8pc" event={"ID":"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c","Type":"ContainerDied","Data":"abd96d328fe6401aa15ae9eba37855429a9e11158e974d5e88a3af3d196a895f"} Feb 18 15:13:58 crc kubenswrapper[4817]: I0218 15:13:58.605329 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg8pc" event={"ID":"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c","Type":"ContainerStarted","Data":"05886da2ae62f3cc9f0f14f3f22e6ea0fe1c453f52c968761bc42a698bf4575b"} Feb 18 15:14:00 crc kubenswrapper[4817]: I0218 15:14:00.624354 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg8pc" event={"ID":"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c","Type":"ContainerStarted","Data":"2d45c8a5f21378f7336163315165cf0f2750fe63061824c5328b4b9c5d1566f4"} Feb 18 15:14:01 crc kubenswrapper[4817]: I0218 15:14:01.637095 4817 generic.go:334] "Generic (PLEG): container finished" podID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerID="2d45c8a5f21378f7336163315165cf0f2750fe63061824c5328b4b9c5d1566f4" exitCode=0 Feb 18 15:14:01 crc kubenswrapper[4817]: I0218 15:14:01.637188 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg8pc" 
event={"ID":"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c","Type":"ContainerDied","Data":"2d45c8a5f21378f7336163315165cf0f2750fe63061824c5328b4b9c5d1566f4"} Feb 18 15:14:02 crc kubenswrapper[4817]: I0218 15:14:02.650482 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg8pc" event={"ID":"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c","Type":"ContainerStarted","Data":"38309ba779540af439a962eef5dbbc581c1e028940528bf690b03d8857172a7e"} Feb 18 15:14:02 crc kubenswrapper[4817]: I0218 15:14:02.671389 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bg8pc" podStartSLOduration=3.118448159 podStartE2EDuration="6.671371144s" podCreationTimestamp="2026-02-18 15:13:56 +0000 UTC" firstStartedPulling="2026-02-18 15:13:58.607016168 +0000 UTC m=+4501.182552151" lastFinishedPulling="2026-02-18 15:14:02.159939153 +0000 UTC m=+4504.735475136" observedRunningTime="2026-02-18 15:14:02.668166553 +0000 UTC m=+4505.243702566" watchObservedRunningTime="2026-02-18 15:14:02.671371144 +0000 UTC m=+4505.246907127" Feb 18 15:14:07 crc kubenswrapper[4817]: I0218 15:14:07.199745 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:14:07 crc kubenswrapper[4817]: I0218 15:14:07.200430 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:14:07 crc kubenswrapper[4817]: I0218 15:14:07.261040 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:14:08 crc kubenswrapper[4817]: I0218 15:14:08.308755 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:14:08 crc kubenswrapper[4817]: I0218 15:14:08.384833 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-bg8pc"] Feb 18 15:14:09 crc kubenswrapper[4817]: I0218 15:14:09.711266 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bg8pc" podUID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerName="registry-server" containerID="cri-o://38309ba779540af439a962eef5dbbc581c1e028940528bf690b03d8857172a7e" gracePeriod=2 Feb 18 15:14:10 crc kubenswrapper[4817]: I0218 15:14:10.732178 4817 generic.go:334] "Generic (PLEG): container finished" podID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerID="38309ba779540af439a962eef5dbbc581c1e028940528bf690b03d8857172a7e" exitCode=0 Feb 18 15:14:10 crc kubenswrapper[4817]: I0218 15:14:10.732270 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg8pc" event={"ID":"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c","Type":"ContainerDied","Data":"38309ba779540af439a962eef5dbbc581c1e028940528bf690b03d8857172a7e"} Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.495996 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.627330 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-utilities\") pod \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.627514 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd9n5\" (UniqueName: \"kubernetes.io/projected/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-kube-api-access-zd9n5\") pod \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.627589 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-catalog-content\") pod \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\" (UID: \"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c\") " Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.628593 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-utilities" (OuterVolumeSpecName: "utilities") pod "57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" (UID: "57bb3636-0b72-45fd-a0d1-4df7bf74ea7c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.633157 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-kube-api-access-zd9n5" (OuterVolumeSpecName: "kube-api-access-zd9n5") pod "57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" (UID: "57bb3636-0b72-45fd-a0d1-4df7bf74ea7c"). InnerVolumeSpecName "kube-api-access-zd9n5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.679653 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" (UID: "57bb3636-0b72-45fd-a0d1-4df7bf74ea7c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.730179 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.730220 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd9n5\" (UniqueName: \"kubernetes.io/projected/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-kube-api-access-zd9n5\") on node \"crc\" DevicePath \"\"" Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.730234 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.743136 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg8pc" event={"ID":"57bb3636-0b72-45fd-a0d1-4df7bf74ea7c","Type":"ContainerDied","Data":"05886da2ae62f3cc9f0f14f3f22e6ea0fe1c453f52c968761bc42a698bf4575b"} Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.743197 4817 scope.go:117] "RemoveContainer" containerID="38309ba779540af439a962eef5dbbc581c1e028940528bf690b03d8857172a7e" Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.743357 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg8pc" Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.775341 4817 scope.go:117] "RemoveContainer" containerID="2d45c8a5f21378f7336163315165cf0f2750fe63061824c5328b4b9c5d1566f4" Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.783347 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bg8pc"] Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.793562 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bg8pc"] Feb 18 15:14:11 crc kubenswrapper[4817]: I0218 15:14:11.812906 4817 scope.go:117] "RemoveContainer" containerID="abd96d328fe6401aa15ae9eba37855429a9e11158e974d5e88a3af3d196a895f" Feb 18 15:14:12 crc kubenswrapper[4817]: I0218 15:14:12.183574 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" path="/var/lib/kubelet/pods/57bb3636-0b72-45fd-a0d1-4df7bf74ea7c/volumes" Feb 18 15:14:12 crc kubenswrapper[4817]: I0218 15:14:12.863159 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:14:12 crc kubenswrapper[4817]: I0218 15:14:12.864182 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:14:42 crc kubenswrapper[4817]: I0218 15:14:42.863505 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:14:42 crc kubenswrapper[4817]: I0218 15:14:42.864075 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.168541 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd"] Feb 18 15:15:00 crc kubenswrapper[4817]: E0218 15:15:00.169655 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerName="extract-utilities" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.169673 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerName="extract-utilities" Feb 18 15:15:00 crc kubenswrapper[4817]: E0218 15:15:00.169705 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerName="extract-content" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.169712 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerName="extract-content" Feb 18 15:15:00 crc kubenswrapper[4817]: E0218 15:15:00.169734 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerName="registry-server" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.169742 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerName="registry-server" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.169960 4817 
memory_manager.go:354] "RemoveStaleState removing state" podUID="57bb3636-0b72-45fd-a0d1-4df7bf74ea7c" containerName="registry-server" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.170781 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.173105 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.175014 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.192779 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd"] Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.319146 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54a9abc2-9d84-4822-85b9-94505c4ba9df-config-volume\") pod \"collect-profiles-29523795-mmdmd\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.319467 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54a9abc2-9d84-4822-85b9-94505c4ba9df-secret-volume\") pod \"collect-profiles-29523795-mmdmd\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.319577 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q2chw\" (UniqueName: \"kubernetes.io/projected/54a9abc2-9d84-4822-85b9-94505c4ba9df-kube-api-access-q2chw\") pod \"collect-profiles-29523795-mmdmd\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.420892 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54a9abc2-9d84-4822-85b9-94505c4ba9df-secret-volume\") pod \"collect-profiles-29523795-mmdmd\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.421363 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2chw\" (UniqueName: \"kubernetes.io/projected/54a9abc2-9d84-4822-85b9-94505c4ba9df-kube-api-access-q2chw\") pod \"collect-profiles-29523795-mmdmd\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.421397 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54a9abc2-9d84-4822-85b9-94505c4ba9df-config-volume\") pod \"collect-profiles-29523795-mmdmd\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.422599 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54a9abc2-9d84-4822-85b9-94505c4ba9df-config-volume\") pod \"collect-profiles-29523795-mmdmd\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc 
kubenswrapper[4817]: I0218 15:15:00.450680 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2chw\" (UniqueName: \"kubernetes.io/projected/54a9abc2-9d84-4822-85b9-94505c4ba9df-kube-api-access-q2chw\") pod \"collect-profiles-29523795-mmdmd\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.451637 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54a9abc2-9d84-4822-85b9-94505c4ba9df-secret-volume\") pod \"collect-profiles-29523795-mmdmd\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.491678 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:00 crc kubenswrapper[4817]: I0218 15:15:00.984387 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd"] Feb 18 15:15:01 crc kubenswrapper[4817]: I0218 15:15:01.181406 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" event={"ID":"54a9abc2-9d84-4822-85b9-94505c4ba9df","Type":"ContainerStarted","Data":"1d5ab7fefabfe7c73b1ee13d1755934f061f8160ff2f96a7754327944d509c08"} Feb 18 15:15:02 crc kubenswrapper[4817]: I0218 15:15:02.191852 4817 generic.go:334] "Generic (PLEG): container finished" podID="54a9abc2-9d84-4822-85b9-94505c4ba9df" containerID="09a8461be3d5bbbfcb000a1302a72fa63e23af4544dc74cf4463f07f1d2ef730" exitCode=0 Feb 18 15:15:02 crc kubenswrapper[4817]: I0218 15:15:02.191935 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" event={"ID":"54a9abc2-9d84-4822-85b9-94505c4ba9df","Type":"ContainerDied","Data":"09a8461be3d5bbbfcb000a1302a72fa63e23af4544dc74cf4463f07f1d2ef730"} Feb 18 15:15:03 crc kubenswrapper[4817]: I0218 15:15:03.850233 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:03 crc kubenswrapper[4817]: I0218 15:15:03.991776 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2chw\" (UniqueName: \"kubernetes.io/projected/54a9abc2-9d84-4822-85b9-94505c4ba9df-kube-api-access-q2chw\") pod \"54a9abc2-9d84-4822-85b9-94505c4ba9df\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " Feb 18 15:15:03 crc kubenswrapper[4817]: I0218 15:15:03.992223 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54a9abc2-9d84-4822-85b9-94505c4ba9df-config-volume\") pod \"54a9abc2-9d84-4822-85b9-94505c4ba9df\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " Feb 18 15:15:03 crc kubenswrapper[4817]: I0218 15:15:03.992396 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54a9abc2-9d84-4822-85b9-94505c4ba9df-secret-volume\") pod \"54a9abc2-9d84-4822-85b9-94505c4ba9df\" (UID: \"54a9abc2-9d84-4822-85b9-94505c4ba9df\") " Feb 18 15:15:03 crc kubenswrapper[4817]: I0218 15:15:03.994153 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54a9abc2-9d84-4822-85b9-94505c4ba9df-config-volume" (OuterVolumeSpecName: "config-volume") pod "54a9abc2-9d84-4822-85b9-94505c4ba9df" (UID: "54a9abc2-9d84-4822-85b9-94505c4ba9df"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 15:15:03 crc kubenswrapper[4817]: I0218 15:15:03.999889 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a9abc2-9d84-4822-85b9-94505c4ba9df-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "54a9abc2-9d84-4822-85b9-94505c4ba9df" (UID: "54a9abc2-9d84-4822-85b9-94505c4ba9df"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 15:15:04 crc kubenswrapper[4817]: I0218 15:15:04.000116 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a9abc2-9d84-4822-85b9-94505c4ba9df-kube-api-access-q2chw" (OuterVolumeSpecName: "kube-api-access-q2chw") pod "54a9abc2-9d84-4822-85b9-94505c4ba9df" (UID: "54a9abc2-9d84-4822-85b9-94505c4ba9df"). InnerVolumeSpecName "kube-api-access-q2chw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:15:04 crc kubenswrapper[4817]: I0218 15:15:04.094943 4817 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54a9abc2-9d84-4822-85b9-94505c4ba9df-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:04 crc kubenswrapper[4817]: I0218 15:15:04.095010 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2chw\" (UniqueName: \"kubernetes.io/projected/54a9abc2-9d84-4822-85b9-94505c4ba9df-kube-api-access-q2chw\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:04 crc kubenswrapper[4817]: I0218 15:15:04.095025 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54a9abc2-9d84-4822-85b9-94505c4ba9df-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:04 crc kubenswrapper[4817]: I0218 15:15:04.223320 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" 
event={"ID":"54a9abc2-9d84-4822-85b9-94505c4ba9df","Type":"ContainerDied","Data":"1d5ab7fefabfe7c73b1ee13d1755934f061f8160ff2f96a7754327944d509c08"} Feb 18 15:15:04 crc kubenswrapper[4817]: I0218 15:15:04.223387 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523795-mmdmd" Feb 18 15:15:04 crc kubenswrapper[4817]: I0218 15:15:04.223419 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d5ab7fefabfe7c73b1ee13d1755934f061f8160ff2f96a7754327944d509c08" Feb 18 15:15:04 crc kubenswrapper[4817]: I0218 15:15:04.927248 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz"] Feb 18 15:15:04 crc kubenswrapper[4817]: I0218 15:15:04.938922 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523750-79hmz"] Feb 18 15:15:06 crc kubenswrapper[4817]: I0218 15:15:06.183414 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868382c3-7437-4ae4-9dd7-0f7629fe09ab" path="/var/lib/kubelet/pods/868382c3-7437-4ae4-9dd7-0f7629fe09ab/volumes" Feb 18 15:15:12 crc kubenswrapper[4817]: I0218 15:15:12.863560 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:15:12 crc kubenswrapper[4817]: I0218 15:15:12.864239 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:15:12 crc 
kubenswrapper[4817]: I0218 15:15:12.864301 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 15:15:12 crc kubenswrapper[4817]: I0218 15:15:12.865242 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 15:15:12 crc kubenswrapper[4817]: I0218 15:15:12.865314 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" gracePeriod=600 Feb 18 15:15:12 crc kubenswrapper[4817]: E0218 15:15:12.995721 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:15:13 crc kubenswrapper[4817]: I0218 15:15:13.310428 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" exitCode=0 Feb 18 15:15:13 crc kubenswrapper[4817]: I0218 15:15:13.310478 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" 
event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456"} Feb 18 15:15:13 crc kubenswrapper[4817]: I0218 15:15:13.310523 4817 scope.go:117] "RemoveContainer" containerID="88f9ad2af8642e25df06dc7d11c08f0c159cf5c0e3364736e58acd12597763ea" Feb 18 15:15:13 crc kubenswrapper[4817]: I0218 15:15:13.311291 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:15:13 crc kubenswrapper[4817]: E0218 15:15:13.311732 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:15:20 crc kubenswrapper[4817]: I0218 15:15:20.401484 4817 generic.go:334] "Generic (PLEG): container finished" podID="33627f57-553f-4c87-a517-4fbe8d221665" containerID="2c7851cc8d2d83f51035793ceaf51f382775fd6eeeed12ebf191ff6f75f3b2e7" exitCode=0 Feb 18 15:15:20 crc kubenswrapper[4817]: I0218 15:15:20.401577 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"33627f57-553f-4c87-a517-4fbe8d221665","Type":"ContainerDied","Data":"2c7851cc8d2d83f51035793ceaf51f382775fd6eeeed12ebf191ff6f75f3b2e7"} Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.306835 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.411225 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-temporary\") pod \"33627f57-553f-4c87-a517-4fbe8d221665\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.411291 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-workdir\") pod \"33627f57-553f-4c87-a517-4fbe8d221665\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.411348 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config\") pod \"33627f57-553f-4c87-a517-4fbe8d221665\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.411421 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ca-certs\") pod \"33627f57-553f-4c87-a517-4fbe8d221665\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.411462 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5blq4\" (UniqueName: \"kubernetes.io/projected/33627f57-553f-4c87-a517-4fbe8d221665-kube-api-access-5blq4\") pod \"33627f57-553f-4c87-a517-4fbe8d221665\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.411511 4817 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config-secret\") pod \"33627f57-553f-4c87-a517-4fbe8d221665\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.411612 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ssh-key\") pod \"33627f57-553f-4c87-a517-4fbe8d221665\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.411658 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"33627f57-553f-4c87-a517-4fbe8d221665\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.411689 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-config-data\") pod \"33627f57-553f-4c87-a517-4fbe8d221665\" (UID: \"33627f57-553f-4c87-a517-4fbe8d221665\") " Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.411847 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "33627f57-553f-4c87-a517-4fbe8d221665" (UID: "33627f57-553f-4c87-a517-4fbe8d221665"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.412422 4817 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.413448 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-config-data" (OuterVolumeSpecName: "config-data") pod "33627f57-553f-4c87-a517-4fbe8d221665" (UID: "33627f57-553f-4c87-a517-4fbe8d221665"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.416888 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33627f57-553f-4c87-a517-4fbe8d221665-kube-api-access-5blq4" (OuterVolumeSpecName: "kube-api-access-5blq4") pod "33627f57-553f-4c87-a517-4fbe8d221665" (UID: "33627f57-553f-4c87-a517-4fbe8d221665"). InnerVolumeSpecName "kube-api-access-5blq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.417307 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "33627f57-553f-4c87-a517-4fbe8d221665" (UID: "33627f57-553f-4c87-a517-4fbe8d221665"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.424524 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"33627f57-553f-4c87-a517-4fbe8d221665","Type":"ContainerDied","Data":"0e6916d7d3a28ee762ba7714d3521642e04a11a71f492ecf11bc1af59c7d6d72"} Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.424569 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e6916d7d3a28ee762ba7714d3521642e04a11a71f492ecf11bc1af59c7d6d72" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.424580 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.448854 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "33627f57-553f-4c87-a517-4fbe8d221665" (UID: "33627f57-553f-4c87-a517-4fbe8d221665"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.449937 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "33627f57-553f-4c87-a517-4fbe8d221665" (UID: "33627f57-553f-4c87-a517-4fbe8d221665"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.473920 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "33627f57-553f-4c87-a517-4fbe8d221665" (UID: "33627f57-553f-4c87-a517-4fbe8d221665"). 
InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.489263 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "33627f57-553f-4c87-a517-4fbe8d221665" (UID: "33627f57-553f-4c87-a517-4fbe8d221665"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.521840 4817 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.521887 4817 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.521903 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5blq4\" (UniqueName: \"kubernetes.io/projected/33627f57-553f-4c87-a517-4fbe8d221665-kube-api-access-5blq4\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.521917 4817 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.521929 4817 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33627f57-553f-4c87-a517-4fbe8d221665-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.521966 4817 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.521998 4817 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33627f57-553f-4c87-a517-4fbe8d221665-config-data\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.548910 4817 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.624622 4817 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.853841 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "33627f57-553f-4c87-a517-4fbe8d221665" (UID: "33627f57-553f-4c87-a517-4fbe8d221665"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:15:22 crc kubenswrapper[4817]: I0218 15:15:22.930117 4817 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/33627f57-553f-4c87-a517-4fbe8d221665-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 18 15:15:25 crc kubenswrapper[4817]: I0218 15:15:25.172031 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:15:25 crc kubenswrapper[4817]: E0218 15:15:25.172587 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:15:33 crc kubenswrapper[4817]: I0218 15:15:33.943687 4817 scope.go:117] "RemoveContainer" containerID="22578da341ece50346a885ac68787bde287f82a2032211be12e36cb991fe55fa" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.718444 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 15:15:34 crc kubenswrapper[4817]: E0218 15:15:34.719796 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33627f57-553f-4c87-a517-4fbe8d221665" containerName="tempest-tests-tempest-tests-runner" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.719831 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="33627f57-553f-4c87-a517-4fbe8d221665" containerName="tempest-tests-tempest-tests-runner" Feb 18 15:15:34 crc kubenswrapper[4817]: E0218 15:15:34.719868 4817 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="54a9abc2-9d84-4822-85b9-94505c4ba9df" containerName="collect-profiles" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.719875 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a9abc2-9d84-4822-85b9-94505c4ba9df" containerName="collect-profiles" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.720238 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a9abc2-9d84-4822-85b9-94505c4ba9df" containerName="collect-profiles" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.720273 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="33627f57-553f-4c87-a517-4fbe8d221665" containerName="tempest-tests-tempest-tests-runner" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.721627 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.723949 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rrsh5" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.772044 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.787391 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943aa309-38ce-4bba-8183-bfaf4357b702\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.787608 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lfph\" (UniqueName: \"kubernetes.io/projected/943aa309-38ce-4bba-8183-bfaf4357b702-kube-api-access-8lfph\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943aa309-38ce-4bba-8183-bfaf4357b702\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.889337 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lfph\" (UniqueName: \"kubernetes.io/projected/943aa309-38ce-4bba-8183-bfaf4357b702-kube-api-access-8lfph\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943aa309-38ce-4bba-8183-bfaf4357b702\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.889511 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943aa309-38ce-4bba-8183-bfaf4357b702\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.890101 4817 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943aa309-38ce-4bba-8183-bfaf4357b702\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.956433 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lfph\" (UniqueName: \"kubernetes.io/projected/943aa309-38ce-4bba-8183-bfaf4357b702-kube-api-access-8lfph\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943aa309-38ce-4bba-8183-bfaf4357b702\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 15:15:34 crc kubenswrapper[4817]: I0218 15:15:34.992515 
4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"943aa309-38ce-4bba-8183-bfaf4357b702\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 15:15:35 crc kubenswrapper[4817]: I0218 15:15:35.042664 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 18 15:15:35 crc kubenswrapper[4817]: I0218 15:15:35.531593 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 18 15:15:35 crc kubenswrapper[4817]: I0218 15:15:35.548997 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"943aa309-38ce-4bba-8183-bfaf4357b702","Type":"ContainerStarted","Data":"c2177ab3f495dce452f11ee6af03df82603bf1d1c22cb6e2df8dd2ac284c918a"} Feb 18 15:15:36 crc kubenswrapper[4817]: I0218 15:15:36.172473 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:15:36 crc kubenswrapper[4817]: E0218 15:15:36.172784 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:15:37 crc kubenswrapper[4817]: I0218 15:15:37.575136 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" 
event={"ID":"943aa309-38ce-4bba-8183-bfaf4357b702","Type":"ContainerStarted","Data":"c7f4a4cf1f95470a09c26b369168a964526666a9f5f2936a1b0e0383aab384cd"} Feb 18 15:15:37 crc kubenswrapper[4817]: I0218 15:15:37.597907 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.627473996 podStartE2EDuration="3.597888146s" podCreationTimestamp="2026-02-18 15:15:34 +0000 UTC" firstStartedPulling="2026-02-18 15:15:35.53901342 +0000 UTC m=+4598.114549393" lastFinishedPulling="2026-02-18 15:15:36.50942756 +0000 UTC m=+4599.084963543" observedRunningTime="2026-02-18 15:15:37.585633307 +0000 UTC m=+4600.161169290" watchObservedRunningTime="2026-02-18 15:15:37.597888146 +0000 UTC m=+4600.173424129" Feb 18 15:15:47 crc kubenswrapper[4817]: I0218 15:15:47.171587 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:15:47 crc kubenswrapper[4817]: E0218 15:15:47.172469 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:15:59 crc kubenswrapper[4817]: I0218 15:15:59.172020 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:15:59 crc kubenswrapper[4817]: E0218 15:15:59.172704 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.289415 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-khngq/must-gather-8xxh7"] Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.327147 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-khngq/must-gather-8xxh7"] Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.327252 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khngq/must-gather-8xxh7" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.329424 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-khngq"/"default-dockercfg-xn7rn" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.329805 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-khngq"/"kube-root-ca.crt" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.330358 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-khngq"/"openshift-service-ca.crt" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.441111 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10c9d1d1-19a5-49f6-9466-5395dc592916-must-gather-output\") pod \"must-gather-8xxh7\" (UID: \"10c9d1d1-19a5-49f6-9466-5395dc592916\") " pod="openshift-must-gather-khngq/must-gather-8xxh7" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.441241 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s554\" (UniqueName: 
\"kubernetes.io/projected/10c9d1d1-19a5-49f6-9466-5395dc592916-kube-api-access-5s554\") pod \"must-gather-8xxh7\" (UID: \"10c9d1d1-19a5-49f6-9466-5395dc592916\") " pod="openshift-must-gather-khngq/must-gather-8xxh7" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.543468 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10c9d1d1-19a5-49f6-9466-5395dc592916-must-gather-output\") pod \"must-gather-8xxh7\" (UID: \"10c9d1d1-19a5-49f6-9466-5395dc592916\") " pod="openshift-must-gather-khngq/must-gather-8xxh7" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.543561 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s554\" (UniqueName: \"kubernetes.io/projected/10c9d1d1-19a5-49f6-9466-5395dc592916-kube-api-access-5s554\") pod \"must-gather-8xxh7\" (UID: \"10c9d1d1-19a5-49f6-9466-5395dc592916\") " pod="openshift-must-gather-khngq/must-gather-8xxh7" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.544037 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10c9d1d1-19a5-49f6-9466-5395dc592916-must-gather-output\") pod \"must-gather-8xxh7\" (UID: \"10c9d1d1-19a5-49f6-9466-5395dc592916\") " pod="openshift-must-gather-khngq/must-gather-8xxh7" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.576547 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s554\" (UniqueName: \"kubernetes.io/projected/10c9d1d1-19a5-49f6-9466-5395dc592916-kube-api-access-5s554\") pod \"must-gather-8xxh7\" (UID: \"10c9d1d1-19a5-49f6-9466-5395dc592916\") " pod="openshift-must-gather-khngq/must-gather-8xxh7" Feb 18 15:16:08 crc kubenswrapper[4817]: I0218 15:16:08.663146 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-khngq/must-gather-8xxh7" Feb 18 15:16:09 crc kubenswrapper[4817]: I0218 15:16:09.203902 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-khngq/must-gather-8xxh7"] Feb 18 15:16:09 crc kubenswrapper[4817]: I0218 15:16:09.884463 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/must-gather-8xxh7" event={"ID":"10c9d1d1-19a5-49f6-9466-5395dc592916","Type":"ContainerStarted","Data":"e40bb4f244cb1bad149d7e1a20e44ee5b3d76316f91e101a2d8c5c97c68f4fd5"} Feb 18 15:16:13 crc kubenswrapper[4817]: I0218 15:16:13.171479 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:16:13 crc kubenswrapper[4817]: E0218 15:16:13.172377 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:16:16 crc kubenswrapper[4817]: I0218 15:16:16.958636 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/must-gather-8xxh7" event={"ID":"10c9d1d1-19a5-49f6-9466-5395dc592916","Type":"ContainerStarted","Data":"f90b8a1390161835f4f4c2bae86ad312505d978f84aaff3153555a382150c1ef"} Feb 18 15:16:17 crc kubenswrapper[4817]: I0218 15:16:17.969944 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/must-gather-8xxh7" event={"ID":"10c9d1d1-19a5-49f6-9466-5395dc592916","Type":"ContainerStarted","Data":"52737c484eadbc6de7ed9d471b9d6ffefbc1357ec6591d9ff6924c0234658ad0"} Feb 18 15:16:17 crc kubenswrapper[4817]: I0218 15:16:17.993351 4817 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-khngq/must-gather-8xxh7" podStartSLOduration=2.570202569 podStartE2EDuration="9.993324546s" podCreationTimestamp="2026-02-18 15:16:08 +0000 UTC" firstStartedPulling="2026-02-18 15:16:09.210762123 +0000 UTC m=+4631.786298106" lastFinishedPulling="2026-02-18 15:16:16.6338841 +0000 UTC m=+4639.209420083" observedRunningTime="2026-02-18 15:16:17.986104444 +0000 UTC m=+4640.561640427" watchObservedRunningTime="2026-02-18 15:16:17.993324546 +0000 UTC m=+4640.568860529" Feb 18 15:16:22 crc kubenswrapper[4817]: I0218 15:16:22.303582 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-khngq/crc-debug-gr9gh"] Feb 18 15:16:22 crc kubenswrapper[4817]: I0218 15:16:22.305545 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-gr9gh" Feb 18 15:16:22 crc kubenswrapper[4817]: I0218 15:16:22.463275 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9cxw\" (UniqueName: \"kubernetes.io/projected/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-kube-api-access-d9cxw\") pod \"crc-debug-gr9gh\" (UID: \"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1\") " pod="openshift-must-gather-khngq/crc-debug-gr9gh" Feb 18 15:16:22 crc kubenswrapper[4817]: I0218 15:16:22.463403 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-host\") pod \"crc-debug-gr9gh\" (UID: \"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1\") " pod="openshift-must-gather-khngq/crc-debug-gr9gh" Feb 18 15:16:22 crc kubenswrapper[4817]: I0218 15:16:22.564722 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9cxw\" (UniqueName: \"kubernetes.io/projected/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-kube-api-access-d9cxw\") pod \"crc-debug-gr9gh\" (UID: 
\"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1\") " pod="openshift-must-gather-khngq/crc-debug-gr9gh" Feb 18 15:16:22 crc kubenswrapper[4817]: I0218 15:16:22.564821 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-host\") pod \"crc-debug-gr9gh\" (UID: \"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1\") " pod="openshift-must-gather-khngq/crc-debug-gr9gh" Feb 18 15:16:22 crc kubenswrapper[4817]: I0218 15:16:22.565000 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-host\") pod \"crc-debug-gr9gh\" (UID: \"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1\") " pod="openshift-must-gather-khngq/crc-debug-gr9gh" Feb 18 15:16:22 crc kubenswrapper[4817]: I0218 15:16:22.587706 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9cxw\" (UniqueName: \"kubernetes.io/projected/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-kube-api-access-d9cxw\") pod \"crc-debug-gr9gh\" (UID: \"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1\") " pod="openshift-must-gather-khngq/crc-debug-gr9gh" Feb 18 15:16:22 crc kubenswrapper[4817]: I0218 15:16:22.623472 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-gr9gh" Feb 18 15:16:23 crc kubenswrapper[4817]: I0218 15:16:23.016936 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/crc-debug-gr9gh" event={"ID":"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1","Type":"ContainerStarted","Data":"aefc7d426c08f5a8ba4fc99c2d4ac8c8caae417eacfce9d2c9a1b6eb1691a535"} Feb 18 15:16:28 crc kubenswrapper[4817]: I0218 15:16:28.180862 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:16:28 crc kubenswrapper[4817]: E0218 15:16:28.181799 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:16:38 crc kubenswrapper[4817]: I0218 15:16:38.201046 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/crc-debug-gr9gh" event={"ID":"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1","Type":"ContainerStarted","Data":"c813cd2d1cd6c8bb2376f843f249feb3bd90ba048ac18f948a8a71f735339017"} Feb 18 15:16:38 crc kubenswrapper[4817]: I0218 15:16:38.228435 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-khngq/crc-debug-gr9gh" podStartSLOduration=1.213683403 podStartE2EDuration="16.228410935s" podCreationTimestamp="2026-02-18 15:16:22 +0000 UTC" firstStartedPulling="2026-02-18 15:16:22.714146331 +0000 UTC m=+4645.289682314" lastFinishedPulling="2026-02-18 15:16:37.728873873 +0000 UTC m=+4660.304409846" observedRunningTime="2026-02-18 15:16:38.220595228 +0000 UTC m=+4660.796131221" watchObservedRunningTime="2026-02-18 15:16:38.228410935 +0000 UTC 
m=+4660.803946918" Feb 18 15:16:39 crc kubenswrapper[4817]: I0218 15:16:39.171547 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:16:39 crc kubenswrapper[4817]: E0218 15:16:39.172335 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:16:53 crc kubenswrapper[4817]: I0218 15:16:53.172617 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:16:53 crc kubenswrapper[4817]: E0218 15:16:53.174635 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:17:04 crc kubenswrapper[4817]: I0218 15:17:04.171671 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:17:04 crc kubenswrapper[4817]: E0218 15:17:04.172558 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" 
podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:17:17 crc kubenswrapper[4817]: I0218 15:17:17.172573 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:17:17 crc kubenswrapper[4817]: E0218 15:17:17.173514 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:17:32 crc kubenswrapper[4817]: I0218 15:17:32.172562 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:17:32 crc kubenswrapper[4817]: E0218 15:17:32.173336 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:17:41 crc kubenswrapper[4817]: I0218 15:17:41.817833 4817 generic.go:334] "Generic (PLEG): container finished" podID="6cfd5bd1-4f61-4ca2-914c-823b7620e4a1" containerID="c813cd2d1cd6c8bb2376f843f249feb3bd90ba048ac18f948a8a71f735339017" exitCode=0 Feb 18 15:17:41 crc kubenswrapper[4817]: I0218 15:17:41.817876 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/crc-debug-gr9gh" event={"ID":"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1","Type":"ContainerDied","Data":"c813cd2d1cd6c8bb2376f843f249feb3bd90ba048ac18f948a8a71f735339017"} Feb 18 15:17:43 crc 
kubenswrapper[4817]: I0218 15:17:43.176264 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-gr9gh" Feb 18 15:17:43 crc kubenswrapper[4817]: I0218 15:17:43.212704 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-khngq/crc-debug-gr9gh"] Feb 18 15:17:43 crc kubenswrapper[4817]: I0218 15:17:43.224039 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-khngq/crc-debug-gr9gh"] Feb 18 15:17:43 crc kubenswrapper[4817]: I0218 15:17:43.309314 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9cxw\" (UniqueName: \"kubernetes.io/projected/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-kube-api-access-d9cxw\") pod \"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1\" (UID: \"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1\") " Feb 18 15:17:43 crc kubenswrapper[4817]: I0218 15:17:43.309617 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-host\") pod \"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1\" (UID: \"6cfd5bd1-4f61-4ca2-914c-823b7620e4a1\") " Feb 18 15:17:43 crc kubenswrapper[4817]: I0218 15:17:43.310882 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-host" (OuterVolumeSpecName: "host") pod "6cfd5bd1-4f61-4ca2-914c-823b7620e4a1" (UID: "6cfd5bd1-4f61-4ca2-914c-823b7620e4a1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 15:17:43 crc kubenswrapper[4817]: I0218 15:17:43.327224 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-kube-api-access-d9cxw" (OuterVolumeSpecName: "kube-api-access-d9cxw") pod "6cfd5bd1-4f61-4ca2-914c-823b7620e4a1" (UID: "6cfd5bd1-4f61-4ca2-914c-823b7620e4a1"). 
InnerVolumeSpecName "kube-api-access-d9cxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:17:43 crc kubenswrapper[4817]: I0218 15:17:43.412160 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9cxw\" (UniqueName: \"kubernetes.io/projected/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-kube-api-access-d9cxw\") on node \"crc\" DevicePath \"\"" Feb 18 15:17:43 crc kubenswrapper[4817]: I0218 15:17:43.412212 4817 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1-host\") on node \"crc\" DevicePath \"\"" Feb 18 15:17:43 crc kubenswrapper[4817]: I0218 15:17:43.837765 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aefc7d426c08f5a8ba4fc99c2d4ac8c8caae417eacfce9d2c9a1b6eb1691a535" Feb 18 15:17:43 crc kubenswrapper[4817]: I0218 15:17:43.837826 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-gr9gh" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.173356 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:17:44 crc kubenswrapper[4817]: E0218 15:17:44.174248 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.184724 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cfd5bd1-4f61-4ca2-914c-823b7620e4a1" path="/var/lib/kubelet/pods/6cfd5bd1-4f61-4ca2-914c-823b7620e4a1/volumes" Feb 18 15:17:44 crc 
kubenswrapper[4817]: I0218 15:17:44.418189 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-khngq/crc-debug-vk6mv"] Feb 18 15:17:44 crc kubenswrapper[4817]: E0218 15:17:44.418650 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfd5bd1-4f61-4ca2-914c-823b7620e4a1" containerName="container-00" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.418680 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfd5bd1-4f61-4ca2-914c-823b7620e4a1" containerName="container-00" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.418970 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfd5bd1-4f61-4ca2-914c-823b7620e4a1" containerName="container-00" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.419874 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-vk6mv" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.533773 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b946c\" (UniqueName: \"kubernetes.io/projected/92fe09f9-6fee-4f62-a3eb-27d9118c8646-kube-api-access-b946c\") pod \"crc-debug-vk6mv\" (UID: \"92fe09f9-6fee-4f62-a3eb-27d9118c8646\") " pod="openshift-must-gather-khngq/crc-debug-vk6mv" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.534258 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92fe09f9-6fee-4f62-a3eb-27d9118c8646-host\") pod \"crc-debug-vk6mv\" (UID: \"92fe09f9-6fee-4f62-a3eb-27d9118c8646\") " pod="openshift-must-gather-khngq/crc-debug-vk6mv" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.635617 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b946c\" (UniqueName: \"kubernetes.io/projected/92fe09f9-6fee-4f62-a3eb-27d9118c8646-kube-api-access-b946c\") pod 
\"crc-debug-vk6mv\" (UID: \"92fe09f9-6fee-4f62-a3eb-27d9118c8646\") " pod="openshift-must-gather-khngq/crc-debug-vk6mv" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.635798 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92fe09f9-6fee-4f62-a3eb-27d9118c8646-host\") pod \"crc-debug-vk6mv\" (UID: \"92fe09f9-6fee-4f62-a3eb-27d9118c8646\") " pod="openshift-must-gather-khngq/crc-debug-vk6mv" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.636014 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92fe09f9-6fee-4f62-a3eb-27d9118c8646-host\") pod \"crc-debug-vk6mv\" (UID: \"92fe09f9-6fee-4f62-a3eb-27d9118c8646\") " pod="openshift-must-gather-khngq/crc-debug-vk6mv" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.659710 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b946c\" (UniqueName: \"kubernetes.io/projected/92fe09f9-6fee-4f62-a3eb-27d9118c8646-kube-api-access-b946c\") pod \"crc-debug-vk6mv\" (UID: \"92fe09f9-6fee-4f62-a3eb-27d9118c8646\") " pod="openshift-must-gather-khngq/crc-debug-vk6mv" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.740557 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-vk6mv" Feb 18 15:17:44 crc kubenswrapper[4817]: I0218 15:17:44.848878 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/crc-debug-vk6mv" event={"ID":"92fe09f9-6fee-4f62-a3eb-27d9118c8646","Type":"ContainerStarted","Data":"31a722a8d22def5aae2508acbc21546e81aa0538c2a0a457ed3354ed17d7cfb1"} Feb 18 15:17:45 crc kubenswrapper[4817]: I0218 15:17:45.861218 4817 generic.go:334] "Generic (PLEG): container finished" podID="92fe09f9-6fee-4f62-a3eb-27d9118c8646" containerID="f417b606eacd8b00a427ec382dcc8fdbbb60a4eef17209504cb18e3ab9f01803" exitCode=0 Feb 18 15:17:45 crc kubenswrapper[4817]: I0218 15:17:45.861270 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/crc-debug-vk6mv" event={"ID":"92fe09f9-6fee-4f62-a3eb-27d9118c8646","Type":"ContainerDied","Data":"f417b606eacd8b00a427ec382dcc8fdbbb60a4eef17209504cb18e3ab9f01803"} Feb 18 15:17:46 crc kubenswrapper[4817]: I0218 15:17:46.993747 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-vk6mv" Feb 18 15:17:47 crc kubenswrapper[4817]: I0218 15:17:47.089675 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92fe09f9-6fee-4f62-a3eb-27d9118c8646-host\") pod \"92fe09f9-6fee-4f62-a3eb-27d9118c8646\" (UID: \"92fe09f9-6fee-4f62-a3eb-27d9118c8646\") " Feb 18 15:17:47 crc kubenswrapper[4817]: I0218 15:17:47.089919 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b946c\" (UniqueName: \"kubernetes.io/projected/92fe09f9-6fee-4f62-a3eb-27d9118c8646-kube-api-access-b946c\") pod \"92fe09f9-6fee-4f62-a3eb-27d9118c8646\" (UID: \"92fe09f9-6fee-4f62-a3eb-27d9118c8646\") " Feb 18 15:17:47 crc kubenswrapper[4817]: I0218 15:17:47.090055 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92fe09f9-6fee-4f62-a3eb-27d9118c8646-host" (OuterVolumeSpecName: "host") pod "92fe09f9-6fee-4f62-a3eb-27d9118c8646" (UID: "92fe09f9-6fee-4f62-a3eb-27d9118c8646"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 15:17:47 crc kubenswrapper[4817]: I0218 15:17:47.090601 4817 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/92fe09f9-6fee-4f62-a3eb-27d9118c8646-host\") on node \"crc\" DevicePath \"\"" Feb 18 15:17:47 crc kubenswrapper[4817]: I0218 15:17:47.115324 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92fe09f9-6fee-4f62-a3eb-27d9118c8646-kube-api-access-b946c" (OuterVolumeSpecName: "kube-api-access-b946c") pod "92fe09f9-6fee-4f62-a3eb-27d9118c8646" (UID: "92fe09f9-6fee-4f62-a3eb-27d9118c8646"). InnerVolumeSpecName "kube-api-access-b946c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:17:47 crc kubenswrapper[4817]: I0218 15:17:47.191996 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b946c\" (UniqueName: \"kubernetes.io/projected/92fe09f9-6fee-4f62-a3eb-27d9118c8646-kube-api-access-b946c\") on node \"crc\" DevicePath \"\"" Feb 18 15:17:47 crc kubenswrapper[4817]: I0218 15:17:47.765921 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-khngq/crc-debug-vk6mv"] Feb 18 15:17:47 crc kubenswrapper[4817]: I0218 15:17:47.776289 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-khngq/crc-debug-vk6mv"] Feb 18 15:17:47 crc kubenswrapper[4817]: I0218 15:17:47.890676 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31a722a8d22def5aae2508acbc21546e81aa0538c2a0a457ed3354ed17d7cfb1" Feb 18 15:17:47 crc kubenswrapper[4817]: I0218 15:17:47.890933 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-vk6mv" Feb 18 15:17:48 crc kubenswrapper[4817]: I0218 15:17:48.186931 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92fe09f9-6fee-4f62-a3eb-27d9118c8646" path="/var/lib/kubelet/pods/92fe09f9-6fee-4f62-a3eb-27d9118c8646/volumes" Feb 18 15:17:48 crc kubenswrapper[4817]: I0218 15:17:48.975814 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-khngq/crc-debug-6dqc6"] Feb 18 15:17:48 crc kubenswrapper[4817]: E0218 15:17:48.976420 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92fe09f9-6fee-4f62-a3eb-27d9118c8646" containerName="container-00" Feb 18 15:17:48 crc kubenswrapper[4817]: I0218 15:17:48.976445 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="92fe09f9-6fee-4f62-a3eb-27d9118c8646" containerName="container-00" Feb 18 15:17:48 crc kubenswrapper[4817]: I0218 15:17:48.976683 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="92fe09f9-6fee-4f62-a3eb-27d9118c8646" containerName="container-00" Feb 18 15:17:48 crc kubenswrapper[4817]: I0218 15:17:48.977613 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-6dqc6" Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.029239 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqpvl\" (UniqueName: \"kubernetes.io/projected/d057fd1a-3606-47d7-8a04-c2782c959e51-kube-api-access-mqpvl\") pod \"crc-debug-6dqc6\" (UID: \"d057fd1a-3606-47d7-8a04-c2782c959e51\") " pod="openshift-must-gather-khngq/crc-debug-6dqc6" Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.029535 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d057fd1a-3606-47d7-8a04-c2782c959e51-host\") pod \"crc-debug-6dqc6\" (UID: \"d057fd1a-3606-47d7-8a04-c2782c959e51\") " pod="openshift-must-gather-khngq/crc-debug-6dqc6" Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.132268 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqpvl\" (UniqueName: \"kubernetes.io/projected/d057fd1a-3606-47d7-8a04-c2782c959e51-kube-api-access-mqpvl\") pod \"crc-debug-6dqc6\" (UID: \"d057fd1a-3606-47d7-8a04-c2782c959e51\") " pod="openshift-must-gather-khngq/crc-debug-6dqc6" Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.132465 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d057fd1a-3606-47d7-8a04-c2782c959e51-host\") pod \"crc-debug-6dqc6\" (UID: \"d057fd1a-3606-47d7-8a04-c2782c959e51\") " pod="openshift-must-gather-khngq/crc-debug-6dqc6" Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.132593 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d057fd1a-3606-47d7-8a04-c2782c959e51-host\") pod \"crc-debug-6dqc6\" (UID: \"d057fd1a-3606-47d7-8a04-c2782c959e51\") " pod="openshift-must-gather-khngq/crc-debug-6dqc6" Feb 18 15:17:49 crc 
kubenswrapper[4817]: I0218 15:17:49.152153 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqpvl\" (UniqueName: \"kubernetes.io/projected/d057fd1a-3606-47d7-8a04-c2782c959e51-kube-api-access-mqpvl\") pod \"crc-debug-6dqc6\" (UID: \"d057fd1a-3606-47d7-8a04-c2782c959e51\") " pod="openshift-must-gather-khngq/crc-debug-6dqc6" Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.298436 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-6dqc6" Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.912492 4817 generic.go:334] "Generic (PLEG): container finished" podID="d057fd1a-3606-47d7-8a04-c2782c959e51" containerID="55bd525541c028485b307aad553eb91c3799928aa39288a5a3bf4bc25807889a" exitCode=0 Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.912608 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/crc-debug-6dqc6" event={"ID":"d057fd1a-3606-47d7-8a04-c2782c959e51","Type":"ContainerDied","Data":"55bd525541c028485b307aad553eb91c3799928aa39288a5a3bf4bc25807889a"} Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.913160 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/crc-debug-6dqc6" event={"ID":"d057fd1a-3606-47d7-8a04-c2782c959e51","Type":"ContainerStarted","Data":"623e7c8c2baec9e3b17713661faf550e5e96c1a322cfe2cabe8a9be33c235fdc"} Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.963033 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-khngq/crc-debug-6dqc6"] Feb 18 15:17:49 crc kubenswrapper[4817]: I0218 15:17:49.972759 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-khngq/crc-debug-6dqc6"] Feb 18 15:17:51 crc kubenswrapper[4817]: I0218 15:17:51.047411 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-6dqc6" Feb 18 15:17:51 crc kubenswrapper[4817]: I0218 15:17:51.067353 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqpvl\" (UniqueName: \"kubernetes.io/projected/d057fd1a-3606-47d7-8a04-c2782c959e51-kube-api-access-mqpvl\") pod \"d057fd1a-3606-47d7-8a04-c2782c959e51\" (UID: \"d057fd1a-3606-47d7-8a04-c2782c959e51\") " Feb 18 15:17:51 crc kubenswrapper[4817]: I0218 15:17:51.067478 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d057fd1a-3606-47d7-8a04-c2782c959e51-host\") pod \"d057fd1a-3606-47d7-8a04-c2782c959e51\" (UID: \"d057fd1a-3606-47d7-8a04-c2782c959e51\") " Feb 18 15:17:51 crc kubenswrapper[4817]: I0218 15:17:51.068192 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d057fd1a-3606-47d7-8a04-c2782c959e51-host" (OuterVolumeSpecName: "host") pod "d057fd1a-3606-47d7-8a04-c2782c959e51" (UID: "d057fd1a-3606-47d7-8a04-c2782c959e51"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 15:17:51 crc kubenswrapper[4817]: I0218 15:17:51.074786 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d057fd1a-3606-47d7-8a04-c2782c959e51-kube-api-access-mqpvl" (OuterVolumeSpecName: "kube-api-access-mqpvl") pod "d057fd1a-3606-47d7-8a04-c2782c959e51" (UID: "d057fd1a-3606-47d7-8a04-c2782c959e51"). InnerVolumeSpecName "kube-api-access-mqpvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:17:51 crc kubenswrapper[4817]: I0218 15:17:51.170703 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqpvl\" (UniqueName: \"kubernetes.io/projected/d057fd1a-3606-47d7-8a04-c2782c959e51-kube-api-access-mqpvl\") on node \"crc\" DevicePath \"\"" Feb 18 15:17:51 crc kubenswrapper[4817]: I0218 15:17:51.170750 4817 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d057fd1a-3606-47d7-8a04-c2782c959e51-host\") on node \"crc\" DevicePath \"\"" Feb 18 15:17:51 crc kubenswrapper[4817]: I0218 15:17:51.934855 4817 scope.go:117] "RemoveContainer" containerID="55bd525541c028485b307aad553eb91c3799928aa39288a5a3bf4bc25807889a" Feb 18 15:17:51 crc kubenswrapper[4817]: I0218 15:17:51.934916 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khngq/crc-debug-6dqc6" Feb 18 15:17:52 crc kubenswrapper[4817]: I0218 15:17:52.185384 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d057fd1a-3606-47d7-8a04-c2782c959e51" path="/var/lib/kubelet/pods/d057fd1a-3606-47d7-8a04-c2782c959e51/volumes" Feb 18 15:17:58 crc kubenswrapper[4817]: I0218 15:17:58.235336 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:17:58 crc kubenswrapper[4817]: E0218 15:17:58.250741 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:18:11 crc kubenswrapper[4817]: I0218 15:18:11.172929 4817 scope.go:117] "RemoveContainer" 
containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456"
Feb 18 15:18:11 crc kubenswrapper[4817]: E0218 15:18:11.173714 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:18:20 crc kubenswrapper[4817]: I0218 15:18:20.487949 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9e5146f3-4a88-4e31-82e7-0e0f72188d22/init-config-reloader/0.log"
Feb 18 15:18:20 crc kubenswrapper[4817]: I0218 15:18:20.746045 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9e5146f3-4a88-4e31-82e7-0e0f72188d22/alertmanager/0.log"
Feb 18 15:18:20 crc kubenswrapper[4817]: I0218 15:18:20.758683 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9e5146f3-4a88-4e31-82e7-0e0f72188d22/init-config-reloader/0.log"
Feb 18 15:18:20 crc kubenswrapper[4817]: I0218 15:18:20.759762 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9e5146f3-4a88-4e31-82e7-0e0f72188d22/config-reloader/0.log"
Feb 18 15:18:20 crc kubenswrapper[4817]: I0218 15:18:20.960398 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-759f74666b-ms4jl_f6b68ae5-35a8-4050-9c64-e6ef834803fd/barbican-api/0.log"
Feb 18 15:18:20 crc kubenswrapper[4817]: I0218 15:18:20.986416 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-759f74666b-ms4jl_f6b68ae5-35a8-4050-9c64-e6ef834803fd/barbican-api-log/0.log"
Feb 18 15:18:21 crc kubenswrapper[4817]: I0218 15:18:21.075446 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bf77b97db-hknps_33929d5f-e679-44e2-a0e9-816088e17cb1/barbican-keystone-listener/0.log"
Feb 18 15:18:21 crc kubenswrapper[4817]: I0218 15:18:21.253077 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-78b498b86c-nn4vc_864e8c7f-e3b7-4f27-960e-df753b339571/barbican-worker/0.log"
Feb 18 15:18:21 crc kubenswrapper[4817]: I0218 15:18:21.302324 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bf77b97db-hknps_33929d5f-e679-44e2-a0e9-816088e17cb1/barbican-keystone-listener-log/0.log"
Feb 18 15:18:21 crc kubenswrapper[4817]: I0218 15:18:21.353131 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-78b498b86c-nn4vc_864e8c7f-e3b7-4f27-960e-df753b339571/barbican-worker-log/0.log"
Feb 18 15:18:21 crc kubenswrapper[4817]: I0218 15:18:21.603581 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f_d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:21 crc kubenswrapper[4817]: I0218 15:18:21.657660 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8/ceilometer-central-agent/0.log"
Feb 18 15:18:21 crc kubenswrapper[4817]: I0218 15:18:21.726940 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8/ceilometer-notification-agent/0.log"
Feb 18 15:18:21 crc kubenswrapper[4817]: I0218 15:18:21.839786 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8/sg-core/0.log"
Feb 18 15:18:21 crc kubenswrapper[4817]: I0218 15:18:21.887321 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8/proxy-httpd/0.log"
Feb 18 15:18:22 crc kubenswrapper[4817]: I0218 15:18:22.004335 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a414e293-71b9-44c3-8f07-20f3696f7db6/cinder-api/0.log"
Feb 18 15:18:22 crc kubenswrapper[4817]: I0218 15:18:22.056158 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a414e293-71b9-44c3-8f07-20f3696f7db6/cinder-api-log/0.log"
Feb 18 15:18:22 crc kubenswrapper[4817]: I0218 15:18:22.296112 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_89455d4a-c424-4e7a-85c5-42163318e132/probe/0.log"
Feb 18 15:18:22 crc kubenswrapper[4817]: I0218 15:18:22.331398 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_89455d4a-c424-4e7a-85c5-42163318e132/cinder-scheduler/0.log"
Feb 18 15:18:22 crc kubenswrapper[4817]: I0218 15:18:22.901729 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_09f7d0fc-a70f-4296-82f1-1cdd302a4a60/cloudkitty-api-log/0.log"
Feb 18 15:18:22 crc kubenswrapper[4817]: I0218 15:18:22.942899 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_09f7d0fc-a70f-4296-82f1-1cdd302a4a60/cloudkitty-api/0.log"
Feb 18 15:18:23 crc kubenswrapper[4817]: I0218 15:18:23.001086 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_75cbd0d0-2a48-48ba-9cae-d465da658b05/loki-compactor/0.log"
Feb 18 15:18:23 crc kubenswrapper[4817]: I0218 15:18:23.180997 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-fmj4p_f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b/loki-distributor/0.log"
Feb 18 15:18:23 crc kubenswrapper[4817]: I0218 15:18:23.230226 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-j7gxb_e596742a-2a5e-4a0c-9177-2b5a1ce00651/gateway/0.log"
Feb 18 15:18:23 crc kubenswrapper[4817]: I0218 15:18:23.438244 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-n8c26_864c0a91-5aa3-4a84-8b75-6f75e0883aea/gateway/0.log"
Feb 18 15:18:23 crc kubenswrapper[4817]: I0218 15:18:23.916913 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_6d8b01c7-a1be-49d1-8417-ce412fa834a4/loki-index-gateway/0.log"
Feb 18 15:18:24 crc kubenswrapper[4817]: I0218 15:18:24.046408 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_7f685dd5-8921-4e4a-a4d5-d19a499775f5/loki-ingester/0.log"
Feb 18 15:18:24 crc kubenswrapper[4817]: I0218 15:18:24.314386 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz_00036a73-dd30-4b48-a135-19b064818e5c/loki-query-frontend/0.log"
Feb 18 15:18:25 crc kubenswrapper[4817]: I0218 15:18:25.023771 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vftp7_f04dc5ce-0657-4e8c-8c0a-3b86924ea903/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:25 crc kubenswrapper[4817]: I0218 15:18:25.171355 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456"
Feb 18 15:18:25 crc kubenswrapper[4817]: E0218 15:18:25.171679 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:18:25 crc kubenswrapper[4817]: I0218 15:18:25.190630 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-qb4ph_b7632ade-ab1b-45b8-9f25-9fb98abc4f1a/loki-querier/0.log"
Feb 18 15:18:25 crc kubenswrapper[4817]: I0218 15:18:25.287628 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-79xdx_e0745d01-0937-448d-a458-6f5823075a7a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:25 crc kubenswrapper[4817]: I0218 15:18:25.460350 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bb494c7f-kmtc2_d48ebb6a-086e-4e2e-b196-5f30c0a82b14/init/0.log"
Feb 18 15:18:25 crc kubenswrapper[4817]: I0218 15:18:25.838636 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bb494c7f-kmtc2_d48ebb6a-086e-4e2e-b196-5f30c0a82b14/init/0.log"
Feb 18 15:18:25 crc kubenswrapper[4817]: I0218 15:18:25.906675 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc_5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:25 crc kubenswrapper[4817]: I0218 15:18:25.907037 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bb494c7f-kmtc2_d48ebb6a-086e-4e2e-b196-5f30c0a82b14/dnsmasq-dns/0.log"
Feb 18 15:18:26 crc kubenswrapper[4817]: I0218 15:18:26.391753 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc799768-b6dd-4b19-aee6-909d985e2441/glance-httpd/0.log"
Feb 18 15:18:26 crc kubenswrapper[4817]: I0218 15:18:26.439251 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc799768-b6dd-4b19-aee6-909d985e2441/glance-log/0.log"
Feb 18 15:18:26 crc kubenswrapper[4817]: I0218 15:18:26.688780 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b542b984-8146-47e2-b20a-1b344762c302/glance-log/0.log"
Feb 18 15:18:26 crc kubenswrapper[4817]: I0218 15:18:26.707884 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b542b984-8146-47e2-b20a-1b344762c302/glance-httpd/0.log"
Feb 18 15:18:26 crc kubenswrapper[4817]: I0218 15:18:26.846771 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg_f54e3715-121a-4498-a552-5a9f1daed55c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:27 crc kubenswrapper[4817]: I0218 15:18:27.078608 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kttlh_2e45ac1d-02e2-457d-9944-cf1ecaf8edd3/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:27 crc kubenswrapper[4817]: I0218 15:18:27.331376 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29523781-vf8xd_06ad1a18-8a33-4ac1-a6df-9cb4b251c549/keystone-cron/0.log"
Feb 18 15:18:27 crc kubenswrapper[4817]: I0218 15:18:27.675812 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-755bd56c8d-4mwpl_47ace64c-6cdb-4868-8655-7e149f33a069/keystone-api/0.log"
Feb 18 15:18:27 crc kubenswrapper[4817]: I0218 15:18:27.680750 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_493be418-0841-4197-9fd7-50f22ecc6a5a/kube-state-metrics/0.log"
Feb 18 15:18:27 crc kubenswrapper[4817]: I0218 15:18:27.774680 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2_43d12b3f-f980-4075-8684-a97141a5474d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:28 crc kubenswrapper[4817]: I0218 15:18:28.323522 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7c8c8d4f9c-f58g5_206709c6-0550-4932-8f0e-f9d4c342a26c/neutron-api/0.log"
Feb 18 15:18:28 crc kubenswrapper[4817]: I0218 15:18:28.404785 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7c8c8d4f9c-f58g5_206709c6-0550-4932-8f0e-f9d4c342a26c/neutron-httpd/0.log"
Feb 18 15:18:28 crc kubenswrapper[4817]: I0218 15:18:28.670620 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2_43e2549e-9d03-495a-852e-0d0c283c5d51/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:29 crc kubenswrapper[4817]: I0218 15:18:29.216540 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bea1dd6e-5f07-4dd6-a191-f07f59d36043/nova-api-log/0.log"
Feb 18 15:18:29 crc kubenswrapper[4817]: I0218 15:18:29.423628 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4b99418c-8ded-4927-afdf-a9a6edbabf84/nova-cell0-conductor-conductor/0.log"
Feb 18 15:18:29 crc kubenswrapper[4817]: I0218 15:18:29.554466 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bea1dd6e-5f07-4dd6-a191-f07f59d36043/nova-api-api/0.log"
Feb 18 15:18:29 crc kubenswrapper[4817]: I0218 15:18:29.902399 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7bab4ac8-afc6-4ac1-938c-2d04b5dc7822/nova-cell1-conductor-conductor/0.log"
Feb 18 15:18:29 crc kubenswrapper[4817]: I0218 15:18:29.987484 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fc896752-3d52-40cd-8d7f-2b10ba1afab5/nova-cell1-novncproxy-novncproxy/0.log"
Feb 18 15:18:30 crc kubenswrapper[4817]: I0218 15:18:30.221755 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jd5f6_095f77dc-6f9e-4845-9cfe-6aeac65d3ab0/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:30 crc kubenswrapper[4817]: I0218 15:18:30.603116 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bd52eab4-329f-4cab-83cc-c082d2d3f1d4/nova-metadata-log/0.log"
Feb 18 15:18:31 crc kubenswrapper[4817]: I0218 15:18:31.081700 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d5bac496-cea3-4c61-91c1-0c0ebc884737/nova-scheduler-scheduler/0.log"
Feb 18 15:18:31 crc kubenswrapper[4817]: I0218 15:18:31.184242 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_641785a9-2372-4857-8882-192bf7d7fe45/mysql-bootstrap/0.log"
Feb 18 15:18:31 crc kubenswrapper[4817]: I0218 15:18:31.441717 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_641785a9-2372-4857-8882-192bf7d7fe45/mysql-bootstrap/0.log"
Feb 18 15:18:31 crc kubenswrapper[4817]: I0218 15:18:31.484414 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_641785a9-2372-4857-8882-192bf7d7fe45/galera/0.log"
Feb 18 15:18:31 crc kubenswrapper[4817]: I0218 15:18:31.796445 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_def7b080-de6e-49f1-9437-44d6f40b48c4/mysql-bootstrap/0.log"
Feb 18 15:18:31 crc kubenswrapper[4817]: I0218 15:18:31.999098 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_def7b080-de6e-49f1-9437-44d6f40b48c4/mysql-bootstrap/0.log"
Feb 18 15:18:32 crc kubenswrapper[4817]: I0218 15:18:32.058431 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_def7b080-de6e-49f1-9437-44d6f40b48c4/galera/0.log"
Feb 18 15:18:32 crc kubenswrapper[4817]: I0218 15:18:32.295212 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ace81bfb-db15-429f-9168-936817dad694/openstackclient/0.log"
Feb 18 15:18:32 crc kubenswrapper[4817]: I0218 15:18:32.548071 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9bcxg_ddb73215-bd2a-47eb-bbcf-b4708117244f/ovn-controller/0.log"
Feb 18 15:18:32 crc kubenswrapper[4817]: I0218 15:18:32.863786 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bd52eab4-329f-4cab-83cc-c082d2d3f1d4/nova-metadata-metadata/0.log"
Feb 18 15:18:32 crc kubenswrapper[4817]: I0218 15:18:32.884759 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-gk7p2_be226950-1270-454d-8b23-2260dba4c819/openstack-network-exporter/0.log"
Feb 18 15:18:33 crc kubenswrapper[4817]: I0218 15:18:33.170573 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-46wx9_2a166377-16ac-4c6b-9207-cddf8c814dc1/ovsdb-server-init/0.log"
Feb 18 15:18:34 crc kubenswrapper[4817]: I0218 15:18:34.138846 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-46wx9_2a166377-16ac-4c6b-9207-cddf8c814dc1/ovsdb-server-init/0.log"
Feb 18 15:18:34 crc kubenswrapper[4817]: I0218 15:18:34.218695 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-46wx9_2a166377-16ac-4c6b-9207-cddf8c814dc1/ovsdb-server/0.log"
Feb 18 15:18:34 crc kubenswrapper[4817]: I0218 15:18:34.266738 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-46wx9_2a166377-16ac-4c6b-9207-cddf8c814dc1/ovs-vswitchd/0.log"
Feb 18 15:18:34 crc kubenswrapper[4817]: I0218 15:18:34.580181 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-6twbf_ae44a6c3-1d36-4f95-b52a-a1bedc6ec272/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:34 crc kubenswrapper[4817]: I0218 15:18:34.640902 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8b26cbdf-a148-4d07-bffc-afa241bc30e2/openstack-network-exporter/0.log"
Feb 18 15:18:34 crc kubenswrapper[4817]: I0218 15:18:34.823530 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8b26cbdf-a148-4d07-bffc-afa241bc30e2/ovn-northd/0.log"
Feb 18 15:18:35 crc kubenswrapper[4817]: I0218 15:18:35.005967 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b89d17a9-16cb-4abe-ba88-107ce95dbceb/openstack-network-exporter/0.log"
Feb 18 15:18:35 crc kubenswrapper[4817]: I0218 15:18:35.083108 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b89d17a9-16cb-4abe-ba88-107ce95dbceb/ovsdbserver-nb/0.log"
Feb 18 15:18:35 crc kubenswrapper[4817]: I0218 15:18:35.260366 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_317526d8-4a73-4ae4-9607-b1d7375ba7f6/openstack-network-exporter/0.log"
Feb 18 15:18:35 crc kubenswrapper[4817]: I0218 15:18:35.361330 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad/cloudkitty-proc/0.log"
Feb 18 15:18:35 crc kubenswrapper[4817]: I0218 15:18:35.369704 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_317526d8-4a73-4ae4-9607-b1d7375ba7f6/ovsdbserver-sb/0.log"
Feb 18 15:18:35 crc kubenswrapper[4817]: I0218 15:18:35.967906 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b57694bbd-rpg5b_d284bf7a-b9d0-4fe3-b8bc-06a64d104853/placement-api/0.log"
Feb 18 15:18:36 crc kubenswrapper[4817]: I0218 15:18:36.002994 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b57694bbd-rpg5b_d284bf7a-b9d0-4fe3-b8bc-06a64d104853/placement-log/0.log"
Feb 18 15:18:36 crc kubenswrapper[4817]: I0218 15:18:36.187827 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fbb28d6a-260d-45fa-80ec-9f583e8fc37b/init-config-reloader/0.log"
Feb 18 15:18:36 crc kubenswrapper[4817]: I0218 15:18:36.440556 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fbb28d6a-260d-45fa-80ec-9f583e8fc37b/prometheus/0.log"
Feb 18 15:18:36 crc kubenswrapper[4817]: I0218 15:18:36.467863 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fbb28d6a-260d-45fa-80ec-9f583e8fc37b/init-config-reloader/0.log"
Feb 18 15:18:36 crc kubenswrapper[4817]: I0218 15:18:36.521615 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fbb28d6a-260d-45fa-80ec-9f583e8fc37b/thanos-sidecar/0.log"
Feb 18 15:18:36 crc kubenswrapper[4817]: I0218 15:18:36.534780 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fbb28d6a-260d-45fa-80ec-9f583e8fc37b/config-reloader/0.log"
Feb 18 15:18:36 crc kubenswrapper[4817]: I0218 15:18:36.774444 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e19f3906-864f-49f8-b3f1-e3cfbcae4133/setup-container/0.log"
Feb 18 15:18:36 crc kubenswrapper[4817]: I0218 15:18:36.969887 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e19f3906-864f-49f8-b3f1-e3cfbcae4133/setup-container/0.log"
Feb 18 15:18:37 crc kubenswrapper[4817]: I0218 15:18:37.011232 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e19f3906-864f-49f8-b3f1-e3cfbcae4133/rabbitmq/0.log"
Feb 18 15:18:37 crc kubenswrapper[4817]: I0218 15:18:37.060439 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f49989fd-6326-4020-aba0-45b49ed37872/setup-container/0.log"
Feb 18 15:18:37 crc kubenswrapper[4817]: I0218 15:18:37.171361 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456"
Feb 18 15:18:37 crc kubenswrapper[4817]: E0218 15:18:37.171652 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:18:37 crc kubenswrapper[4817]: I0218 15:18:37.404933 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f49989fd-6326-4020-aba0-45b49ed37872/rabbitmq/0.log"
Feb 18 15:18:37 crc kubenswrapper[4817]: I0218 15:18:37.418570 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj_3c302fa9-5186-4192-9cf6-b6d533570323/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:37 crc kubenswrapper[4817]: I0218 15:18:37.419451 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f49989fd-6326-4020-aba0-45b49ed37872/setup-container/0.log"
Feb 18 15:18:37 crc kubenswrapper[4817]: I0218 15:18:37.644388 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-svdph_3c90beed-8bc0-4b1c-9c6c-2279e303fbb1/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:37 crc kubenswrapper[4817]: I0218 15:18:37.771376 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-hsvp2_d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:38 crc kubenswrapper[4817]: I0218 15:18:38.004440 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vxt6n_abdb2358-3c43-4027-ab8e-fb25932c4f97/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:38 crc kubenswrapper[4817]: I0218 15:18:38.113015 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-t7ncv_1bef22ac-a84d-4941-8290-6b98eb56367b/ssh-known-hosts-edpm-deployment/0.log"
Feb 18 15:18:38 crc kubenswrapper[4817]: I0218 15:18:38.380913 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-549ff9d7ff-4pfxq_85a61008-fd45-4598-90bc-b0cf2856cefa/proxy-server/0.log"
Feb 18 15:18:38 crc kubenswrapper[4817]: I0218 15:18:38.515260 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-c5btx_7f8fdaa1-d441-4b5e-b376-8ab67ce68339/swift-ring-rebalance/0.log"
Feb 18 15:18:38 crc kubenswrapper[4817]: I0218 15:18:38.615643 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-549ff9d7ff-4pfxq_85a61008-fd45-4598-90bc-b0cf2856cefa/proxy-httpd/0.log"
Feb 18 15:18:38 crc kubenswrapper[4817]: I0218 15:18:38.763357 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/account-auditor/0.log"
Feb 18 15:18:38 crc kubenswrapper[4817]: I0218 15:18:38.843176 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/account-reaper/0.log"
Feb 18 15:18:38 crc kubenswrapper[4817]: I0218 15:18:38.964504 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/account-replicator/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.058375 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/container-auditor/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.189886 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/container-server/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.196787 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/account-server/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.205537 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/container-replicator/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.302706 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/container-updater/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.403651 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/object-expirer/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.449755 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/object-auditor/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.503568 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/object-replicator/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.556211 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/object-server/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.646756 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/object-updater/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.709928 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/rsync/0.log"
Feb 18 15:18:39 crc kubenswrapper[4817]: I0218 15:18:39.716554 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/swift-recon-cron/0.log"
Feb 18 15:18:40 crc kubenswrapper[4817]: I0218 15:18:40.042831 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qxktm_5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:40 crc kubenswrapper[4817]: I0218 15:18:40.071333 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_33627f57-553f-4c87-a517-4fbe8d221665/tempest-tests-tempest-tests-runner/0.log"
Feb 18 15:18:40 crc kubenswrapper[4817]: I0218 15:18:40.334309 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_943aa309-38ce-4bba-8183-bfaf4357b702/test-operator-logs-container/0.log"
Feb 18 15:18:40 crc kubenswrapper[4817]: I0218 15:18:40.502129 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j_30959336-e13c-426f-9116-3fd2e485a6ed/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 18 15:18:46 crc kubenswrapper[4817]: I0218 15:18:46.849877 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cb347a6f-041d-41e7-be8b-b151f150e6ab/memcached/0.log"
Feb 18 15:18:49 crc kubenswrapper[4817]: I0218 15:18:49.171915 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456"
Feb 18 15:18:49 crc kubenswrapper[4817]: E0218 15:18:49.172419 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:19:00 crc kubenswrapper[4817]: I0218 15:19:00.172253 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456"
Feb 18 15:19:00 crc kubenswrapper[4817]: E0218 15:19:00.173166 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:19:12 crc kubenswrapper[4817]: I0218 15:19:12.287500 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/util/0.log"
Feb 18 15:19:12 crc kubenswrapper[4817]: I0218 15:19:12.518872 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/pull/0.log"
Feb 18 15:19:12 crc kubenswrapper[4817]: I0218 15:19:12.519636 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/util/0.log"
Feb 18 15:19:12 crc kubenswrapper[4817]: I0218 15:19:12.523164 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/pull/0.log"
Feb 18 15:19:12 crc kubenswrapper[4817]: I0218 15:19:12.730078 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/util/0.log"
Feb 18 15:19:12 crc kubenswrapper[4817]: I0218 15:19:12.758201 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/pull/0.log"
Feb 18 15:19:12 crc kubenswrapper[4817]: I0218 15:19:12.789949 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/extract/0.log"
Feb 18 15:19:13 crc kubenswrapper[4817]: I0218 15:19:13.334683 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-hwmjj_4441a78b-c58a-4030-801b-06dbfa1729b1/manager/0.log"
Feb 18 15:19:14 crc kubenswrapper[4817]: I0218 15:19:14.173151 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456"
Feb 18 15:19:14 crc kubenswrapper[4817]: E0218 15:19:14.173645 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:19:14 crc kubenswrapper[4817]: I0218 15:19:14.350202 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-c2b7x_00831f79-f6d5-4896-b718-4120117751b8/manager/0.log"
Feb 18 15:19:14 crc kubenswrapper[4817]: I0218 15:19:14.577881 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-zknf5_13246e06-5b63-4076-a556-de264d7afdf4/manager/0.log"
Feb 18 15:19:14 crc kubenswrapper[4817]: I0218 15:19:14.880093 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-gq259_5721fd5d-07bb-44df-bfb8-4b4dd80ac7a4/manager/0.log"
Feb 18 15:19:16 crc kubenswrapper[4817]: I0218 15:19:16.017607 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-xpwgd_e7be81da-3629-4713-87c6-34cabd9a8347/manager/0.log"
Feb 18 15:19:16 crc kubenswrapper[4817]: I0218 15:19:16.070933 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-64rvt_01962a92-98c7-412c-86a7-ee21e6cb92a9/manager/0.log"
Feb 18 15:19:16 crc kubenswrapper[4817]: I0218 15:19:16.148325 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-z64rl_4339e125-4e60-44d9-8e15-97b4000669e2/manager/0.log"
Feb 18 15:19:16 crc kubenswrapper[4817]: I0218 15:19:16.491933 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-fs8m7_85bd6fc0-d973-4172-b441-c15d4abeb604/manager/0.log"
Feb 18 15:19:16 crc kubenswrapper[4817]: I0218 15:19:16.763252 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-4h8qx_c5e8b4c9-5a63-44c3-9f6c-c7ee268dcef3/manager/0.log"
Feb 18 15:19:16 crc kubenswrapper[4817]: I0218 15:19:16.971894 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-4rv4f_644f07fc-02ce-49f0-87bf-54f765c15d8c/manager/0.log"
Feb 18 15:19:17 crc kubenswrapper[4817]: I0218 15:19:17.386682 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-dr67b_6e783396-37c1-4a0d-bfe4-495fdf4d41bf/manager/0.log"
Feb 18 15:19:17 crc kubenswrapper[4817]: I0218 15:19:17.449914 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-9jkwb_ca917110-0727-4c63-ad9a-20722a6cba34/manager/0.log"
Feb 18 15:19:17 crc kubenswrapper[4817]: I0218 15:19:17.783850 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j_ad995216-386a-455b-b48d-378dbfd271bf/manager/0.log"
Feb 18 15:19:18 crc kubenswrapper[4817]: I0218 15:19:18.192488 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5864f6ff6b-g4lc4_b05d374b-b714-4826-80b8-246c15521534/operator/0.log"
Feb 18 15:19:18 crc kubenswrapper[4817]: I0218 15:19:18.501121 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hfr8r_51aa0947-7a1c-4a40-bd45-299bb95ff9f1/registry-server/0.log"
Feb 18 15:19:18 crc kubenswrapper[4817]: I0218 15:19:18.782001 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-l9fqt_9d933918-c23c-456a-8b3f-08ee4c2909dd/manager/0.log"
Feb 18 15:19:19 crc kubenswrapper[4817]: I0218 15:19:19.140520 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-9td4r_2c04c342-bd87-46e7-8a2d-72dc30f858aa/manager/0.log"
Feb 18 15:19:19 crc kubenswrapper[4817]: I0218 15:19:19.448043 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5np9g_086958e1-8a7d-40c9-9725-f18776f863a0/operator/0.log"
Feb 18 15:19:19 crc kubenswrapper[4817]: I0218 15:19:19.707509 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-ff84x_b77d6c32-6c30-42be-ab69-36b969d40950/manager/0.log"
Feb 18 15:19:20 crc kubenswrapper[4817]: I0218 15:19:20.703212 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-8q5c8_bccec692-ee64-46e1-8979-e6173c132d8e/manager/0.log"
Feb 18 15:19:20 crc kubenswrapper[4817]: I0218 15:19:20.916829 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7954588dd9-dngjl_ff1b9fe3-84fe-47fc-902c-aa23c9e829d8/manager/0.log"
Feb 18 15:19:20 crc kubenswrapper[4817]: I0218 15:19:20.952423 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6956d67c5c-xbjdr_3374b90b-3a12-4e01-a0cb-ed7c51d844d7/manager/0.log"
Feb 18 15:19:21 crc kubenswrapper[4817]: I0218 15:19:21.185686 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-8pfz7_93f59b48-31c6-4fed-8ccc-7d722605d896/manager/0.log"
Feb 18 15:19:21 crc kubenswrapper[4817]: I0218 15:19:21.244532 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-qlrqm_0db531ef-d3b4-4b35-9497-8892cbd3db77/manager/0.log"
Feb 18 15:19:25 crc kubenswrapper[4817]: I0218 15:19:25.804932 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-rs8vm_4f0674c2-f05e-4276-b2b0-dc5ed66c187a/manager/0.log"
Feb 18 15:19:26 crc kubenswrapper[4817]: I0218 15:19:26.172347 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456"
Feb 18 15:19:26 crc kubenswrapper[4817]: E0218 15:19:26.172677 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:19:28 crc kubenswrapper[4817]: I0218 15:19:28.783825 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mx2s7"]
Feb 18 15:19:28 crc kubenswrapper[4817]: E0218 15:19:28.784852 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d057fd1a-3606-47d7-8a04-c2782c959e51" containerName="container-00"
Feb 18 15:19:28 crc kubenswrapper[4817]: I0218 15:19:28.784870 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d057fd1a-3606-47d7-8a04-c2782c959e51" containerName="container-00"
Feb 18 15:19:28 crc kubenswrapper[4817]: I0218 15:19:28.785085 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d057fd1a-3606-47d7-8a04-c2782c959e51" containerName="container-00"
Feb 18 15:19:28 crc kubenswrapper[4817]: I0218 15:19:28.786800 4817 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:28 crc kubenswrapper[4817]: I0218 15:19:28.795559 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx2s7"] Feb 18 15:19:28 crc kubenswrapper[4817]: I0218 15:19:28.921013 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzvd8\" (UniqueName: \"kubernetes.io/projected/21c52f79-3702-49f7-b44e-76179b96683f-kube-api-access-qzvd8\") pod \"redhat-marketplace-mx2s7\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:28 crc kubenswrapper[4817]: I0218 15:19:28.921145 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-catalog-content\") pod \"redhat-marketplace-mx2s7\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:28 crc kubenswrapper[4817]: I0218 15:19:28.921220 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-utilities\") pod \"redhat-marketplace-mx2s7\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:29 crc kubenswrapper[4817]: I0218 15:19:29.023544 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzvd8\" (UniqueName: \"kubernetes.io/projected/21c52f79-3702-49f7-b44e-76179b96683f-kube-api-access-qzvd8\") pod \"redhat-marketplace-mx2s7\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:29 crc kubenswrapper[4817]: I0218 15:19:29.023639 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-catalog-content\") pod \"redhat-marketplace-mx2s7\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:29 crc kubenswrapper[4817]: I0218 15:19:29.023675 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-utilities\") pod \"redhat-marketplace-mx2s7\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:29 crc kubenswrapper[4817]: I0218 15:19:29.024165 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-utilities\") pod \"redhat-marketplace-mx2s7\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:29 crc kubenswrapper[4817]: I0218 15:19:29.024345 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-catalog-content\") pod \"redhat-marketplace-mx2s7\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:29 crc kubenswrapper[4817]: I0218 15:19:29.058393 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzvd8\" (UniqueName: \"kubernetes.io/projected/21c52f79-3702-49f7-b44e-76179b96683f-kube-api-access-qzvd8\") pod \"redhat-marketplace-mx2s7\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:29 crc kubenswrapper[4817]: I0218 15:19:29.111489 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:29 crc kubenswrapper[4817]: I0218 15:19:29.664485 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx2s7"] Feb 18 15:19:30 crc kubenswrapper[4817]: I0218 15:19:30.338313 4817 generic.go:334] "Generic (PLEG): container finished" podID="21c52f79-3702-49f7-b44e-76179b96683f" containerID="f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a" exitCode=0 Feb 18 15:19:30 crc kubenswrapper[4817]: I0218 15:19:30.338417 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx2s7" event={"ID":"21c52f79-3702-49f7-b44e-76179b96683f","Type":"ContainerDied","Data":"f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a"} Feb 18 15:19:30 crc kubenswrapper[4817]: I0218 15:19:30.338727 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx2s7" event={"ID":"21c52f79-3702-49f7-b44e-76179b96683f","Type":"ContainerStarted","Data":"bb450f06c397da02acb20fa76b415472f04e9b48182a7500006e30c2d56a9661"} Feb 18 15:19:30 crc kubenswrapper[4817]: I0218 15:19:30.340132 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 15:19:31 crc kubenswrapper[4817]: I0218 15:19:31.350635 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx2s7" event={"ID":"21c52f79-3702-49f7-b44e-76179b96683f","Type":"ContainerStarted","Data":"046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3"} Feb 18 15:19:32 crc kubenswrapper[4817]: I0218 15:19:32.378260 4817 generic.go:334] "Generic (PLEG): container finished" podID="21c52f79-3702-49f7-b44e-76179b96683f" containerID="046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3" exitCode=0 Feb 18 15:19:32 crc kubenswrapper[4817]: I0218 15:19:32.378351 4817 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-mx2s7" event={"ID":"21c52f79-3702-49f7-b44e-76179b96683f","Type":"ContainerDied","Data":"046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3"} Feb 18 15:19:33 crc kubenswrapper[4817]: I0218 15:19:33.391404 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx2s7" event={"ID":"21c52f79-3702-49f7-b44e-76179b96683f","Type":"ContainerStarted","Data":"28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff"} Feb 18 15:19:33 crc kubenswrapper[4817]: I0218 15:19:33.410331 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mx2s7" podStartSLOduration=2.903568727 podStartE2EDuration="5.410314183s" podCreationTimestamp="2026-02-18 15:19:28 +0000 UTC" firstStartedPulling="2026-02-18 15:19:30.339883177 +0000 UTC m=+4832.915419160" lastFinishedPulling="2026-02-18 15:19:32.846628633 +0000 UTC m=+4835.422164616" observedRunningTime="2026-02-18 15:19:33.407643195 +0000 UTC m=+4835.983179188" watchObservedRunningTime="2026-02-18 15:19:33.410314183 +0000 UTC m=+4835.985850166" Feb 18 15:19:38 crc kubenswrapper[4817]: I0218 15:19:38.178274 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:19:38 crc kubenswrapper[4817]: E0218 15:19:38.180784 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:19:39 crc kubenswrapper[4817]: I0218 15:19:39.112085 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:39 crc kubenswrapper[4817]: I0218 15:19:39.112448 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:39 crc kubenswrapper[4817]: I0218 15:19:39.166261 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:39 crc kubenswrapper[4817]: I0218 15:19:39.921965 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:39 crc kubenswrapper[4817]: I0218 15:19:39.983268 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx2s7"] Feb 18 15:19:41 crc kubenswrapper[4817]: I0218 15:19:41.479336 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mx2s7" podUID="21c52f79-3702-49f7-b44e-76179b96683f" containerName="registry-server" containerID="cri-o://28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff" gracePeriod=2 Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.288397 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.416850 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-catalog-content\") pod \"21c52f79-3702-49f7-b44e-76179b96683f\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.417085 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzvd8\" (UniqueName: \"kubernetes.io/projected/21c52f79-3702-49f7-b44e-76179b96683f-kube-api-access-qzvd8\") pod \"21c52f79-3702-49f7-b44e-76179b96683f\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.418367 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-utilities\") pod \"21c52f79-3702-49f7-b44e-76179b96683f\" (UID: \"21c52f79-3702-49f7-b44e-76179b96683f\") " Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.419302 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-utilities" (OuterVolumeSpecName: "utilities") pod "21c52f79-3702-49f7-b44e-76179b96683f" (UID: "21c52f79-3702-49f7-b44e-76179b96683f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.424860 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c52f79-3702-49f7-b44e-76179b96683f-kube-api-access-qzvd8" (OuterVolumeSpecName: "kube-api-access-qzvd8") pod "21c52f79-3702-49f7-b44e-76179b96683f" (UID: "21c52f79-3702-49f7-b44e-76179b96683f"). InnerVolumeSpecName "kube-api-access-qzvd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.469266 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21c52f79-3702-49f7-b44e-76179b96683f" (UID: "21c52f79-3702-49f7-b44e-76179b96683f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.491507 4817 generic.go:334] "Generic (PLEG): container finished" podID="21c52f79-3702-49f7-b44e-76179b96683f" containerID="28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff" exitCode=0 Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.491561 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx2s7" event={"ID":"21c52f79-3702-49f7-b44e-76179b96683f","Type":"ContainerDied","Data":"28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff"} Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.491594 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mx2s7" event={"ID":"21c52f79-3702-49f7-b44e-76179b96683f","Type":"ContainerDied","Data":"bb450f06c397da02acb20fa76b415472f04e9b48182a7500006e30c2d56a9661"} Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.491616 4817 scope.go:117] "RemoveContainer" containerID="28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.491788 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mx2s7" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.512340 4817 scope.go:117] "RemoveContainer" containerID="046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.521444 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzvd8\" (UniqueName: \"kubernetes.io/projected/21c52f79-3702-49f7-b44e-76179b96683f-kube-api-access-qzvd8\") on node \"crc\" DevicePath \"\"" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.521482 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.521500 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21c52f79-3702-49f7-b44e-76179b96683f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.533330 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx2s7"] Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.544156 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mx2s7"] Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.549260 4817 scope.go:117] "RemoveContainer" containerID="f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.598189 4817 scope.go:117] "RemoveContainer" containerID="28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff" Feb 18 15:19:42 crc kubenswrapper[4817]: E0218 15:19:42.598749 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff\": container with ID starting with 28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff not found: ID does not exist" containerID="28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.598820 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff"} err="failed to get container status \"28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff\": rpc error: code = NotFound desc = could not find container \"28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff\": container with ID starting with 28c07b91fd0e3b6535697da3d1762a724578e50cb3d2ff65ecc16e6ed998baff not found: ID does not exist" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.598855 4817 scope.go:117] "RemoveContainer" containerID="046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3" Feb 18 15:19:42 crc kubenswrapper[4817]: E0218 15:19:42.599429 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3\": container with ID starting with 046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3 not found: ID does not exist" containerID="046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.599460 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3"} err="failed to get container status \"046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3\": rpc error: code = NotFound desc = could not find container \"046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3\": container with ID 
starting with 046c7915f0c5211c6e42da57cc55ed714bddf2750974405a9b8ae28b47ccc5e3 not found: ID does not exist" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.599483 4817 scope.go:117] "RemoveContainer" containerID="f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a" Feb 18 15:19:42 crc kubenswrapper[4817]: E0218 15:19:42.599780 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a\": container with ID starting with f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a not found: ID does not exist" containerID="f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a" Feb 18 15:19:42 crc kubenswrapper[4817]: I0218 15:19:42.599811 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a"} err="failed to get container status \"f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a\": rpc error: code = NotFound desc = could not find container \"f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a\": container with ID starting with f70e6e01c6051b1068e9da205331981d5d7f5374609fa0a15352e216edc4fe3a not found: ID does not exist" Feb 18 15:19:44 crc kubenswrapper[4817]: I0218 15:19:44.185685 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c52f79-3702-49f7-b44e-76179b96683f" path="/var/lib/kubelet/pods/21c52f79-3702-49f7-b44e-76179b96683f/volumes" Feb 18 15:19:48 crc kubenswrapper[4817]: I0218 15:19:48.884897 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mj6tz_9f66536d-c481-41b3-b5e5-8259651a95d9/control-plane-machine-set-operator/0.log" Feb 18 15:19:49 crc kubenswrapper[4817]: I0218 15:19:49.108328 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-x8x2r_3bd99710-b175-4115-8944-1fac544145c5/kube-rbac-proxy/0.log" Feb 18 15:19:49 crc kubenswrapper[4817]: I0218 15:19:49.173054 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:19:49 crc kubenswrapper[4817]: E0218 15:19:49.173468 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:19:49 crc kubenswrapper[4817]: I0218 15:19:49.200200 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-x8x2r_3bd99710-b175-4115-8944-1fac544145c5/machine-api-operator/0.log" Feb 18 15:20:03 crc kubenswrapper[4817]: I0218 15:20:03.171716 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:20:03 crc kubenswrapper[4817]: E0218 15:20:03.172706 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:20:04 crc kubenswrapper[4817]: I0218 15:20:04.865075 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-52jl8_cff7c17a-00dd-470b-a121-c8e86485d4ac/cert-manager-controller/0.log" Feb 18 15:20:05 crc 
kubenswrapper[4817]: I0218 15:20:05.131866 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-l2hvx_64c3da31-0521-4691-86b4-66f99e11c898/cert-manager-webhook/0.log" Feb 18 15:20:05 crc kubenswrapper[4817]: I0218 15:20:05.179493 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-kgq95_00d35822-7854-4547-8c51-7d8f747fcb9c/cert-manager-cainjector/0.log" Feb 18 15:20:16 crc kubenswrapper[4817]: I0218 15:20:16.171761 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:20:16 crc kubenswrapper[4817]: I0218 15:20:16.802801 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"de60e1d295605a5ab47852c07cbd58273509dad50b22558ccdb59ed92af85e4c"} Feb 18 15:20:20 crc kubenswrapper[4817]: I0218 15:20:20.920569 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-8qsxw_abf003b3-f87b-4907-ad15-59b8f12108b3/nmstate-console-plugin/0.log" Feb 18 15:20:21 crc kubenswrapper[4817]: I0218 15:20:21.108044 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-l4ntc_8b4f277d-2b45-43de-b3f7-52e968407f19/nmstate-handler/0.log" Feb 18 15:20:21 crc kubenswrapper[4817]: I0218 15:20:21.149808 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-qnftd_65c92dc4-036d-42e0-baa1-3dc7e23c43b3/kube-rbac-proxy/0.log" Feb 18 15:20:21 crc kubenswrapper[4817]: I0218 15:20:21.213505 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-qnftd_65c92dc4-036d-42e0-baa1-3dc7e23c43b3/nmstate-metrics/0.log" Feb 18 15:20:21 crc kubenswrapper[4817]: I0218 
15:20:21.349912 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-4jkpg_ca7ab19f-157f-4626-80d3-27ed1a469d95/nmstate-operator/0.log" Feb 18 15:20:21 crc kubenswrapper[4817]: I0218 15:20:21.501370 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-b2vsq_1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a/nmstate-webhook/0.log" Feb 18 15:20:36 crc kubenswrapper[4817]: I0218 15:20:36.399395 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59d4b4c7c-rvnbw_bf27f33f-390f-44fa-91fb-40f18240d0df/manager/0.log" Feb 18 15:20:36 crc kubenswrapper[4817]: I0218 15:20:36.523577 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59d4b4c7c-rvnbw_bf27f33f-390f-44fa-91fb-40f18240d0df/kube-rbac-proxy/0.log" Feb 18 15:20:50 crc kubenswrapper[4817]: I0218 15:20:50.944994 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2p6zw_bc446e23-6b46-40cc-b058-5f8d491d8310/prometheus-operator/0.log" Feb 18 15:20:51 crc kubenswrapper[4817]: I0218 15:20:51.195297 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_fad4abaa-bb3e-4fa2-9478-37e792ead430/prometheus-operator-admission-webhook/0.log" Feb 18 15:20:51 crc kubenswrapper[4817]: I0218 15:20:51.283409 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_fa24c32b-4905-4756-a765-195d6b0b6c1a/prometheus-operator-admission-webhook/0.log" Feb 18 15:20:51 crc kubenswrapper[4817]: I0218 15:20:51.509569 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-xcdgl_02b7c5c2-ac49-498f-9c4c-c64cf484d131/operator/0.log" Feb 18 15:20:51 crc kubenswrapper[4817]: I0218 15:20:51.566584 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qsnlp_f5816544-7d2c-4bf3-aeab-98f546573810/perses-operator/0.log" Feb 18 15:21:09 crc kubenswrapper[4817]: I0218 15:21:09.639133 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-z6fs7_5889628d-b78a-4279-95fd-ec441aac9d34/kube-rbac-proxy/0.log" Feb 18 15:21:09 crc kubenswrapper[4817]: I0218 15:21:09.776469 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-z6fs7_5889628d-b78a-4279-95fd-ec441aac9d34/controller/0.log" Feb 18 15:21:09 crc kubenswrapper[4817]: I0218 15:21:09.869279 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-frr-files/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.198204 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-reloader/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.221261 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-reloader/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.330591 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-frr-files/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.344730 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-metrics/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.482099 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-frr-files/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.508555 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-reloader/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.547698 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-metrics/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.581543 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-metrics/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.760944 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-metrics/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.784146 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/controller/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.788499 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-reloader/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.793341 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-frr-files/0.log" Feb 18 15:21:10 crc kubenswrapper[4817]: I0218 15:21:10.985786 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/kube-rbac-proxy-frr/0.log" Feb 18 15:21:11 crc kubenswrapper[4817]: I0218 15:21:11.020976 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/frr-metrics/0.log" Feb 18 15:21:11 crc kubenswrapper[4817]: I0218 15:21:11.065577 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/kube-rbac-proxy/0.log" Feb 18 15:21:11 crc kubenswrapper[4817]: I0218 15:21:11.448382 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/reloader/0.log" Feb 18 15:21:11 crc kubenswrapper[4817]: I0218 15:21:11.559682 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-kphxt_e2fe6fd7-48f6-47ec-b4b3-60016704bad9/frr-k8s-webhook-server/0.log" Feb 18 15:21:11 crc kubenswrapper[4817]: I0218 15:21:11.863729 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c8dd94b68-49zzw_ba0591c4-822e-406b-a86b-1f2a6078452c/manager/0.log" Feb 18 15:21:11 crc kubenswrapper[4817]: I0218 15:21:11.982955 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57fdf9bc8-smmwn_4c5c3b60-c65f-4f6c-869b-162ebd95eb32/webhook-server/0.log" Feb 18 15:21:12 crc kubenswrapper[4817]: I0218 15:21:12.230904 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rzsqh_2ce92458-8bf0-41c0-95d1-219f6c35cdf5/kube-rbac-proxy/0.log" Feb 18 15:21:12 crc kubenswrapper[4817]: I0218 15:21:12.466148 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/frr/0.log" Feb 18 15:21:12 crc kubenswrapper[4817]: I0218 15:21:12.771821 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rzsqh_2ce92458-8bf0-41c0-95d1-219f6c35cdf5/speaker/0.log" Feb 18 15:21:28 crc kubenswrapper[4817]: I0218 15:21:28.420409 4817 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/util/0.log" Feb 18 15:21:28 crc kubenswrapper[4817]: I0218 15:21:28.614823 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/util/0.log" Feb 18 15:21:28 crc kubenswrapper[4817]: I0218 15:21:28.621777 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/pull/0.log" Feb 18 15:21:28 crc kubenswrapper[4817]: I0218 15:21:28.662703 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/pull/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.098173 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/extract/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.173389 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/util/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.180650 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/pull/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.336725 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/util/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.616814 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/pull/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.621522 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/util/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.658903 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/pull/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.824659 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/pull/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.828181 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/extract/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.832347 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/util/0.log" Feb 18 15:21:29 crc kubenswrapper[4817]: I0218 15:21:29.992956 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/util/0.log" Feb 18 
15:21:30 crc kubenswrapper[4817]: I0218 15:21:30.253613 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/pull/0.log" Feb 18 15:21:30 crc kubenswrapper[4817]: I0218 15:21:30.254355 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/util/0.log" Feb 18 15:21:30 crc kubenswrapper[4817]: I0218 15:21:30.269995 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/pull/0.log" Feb 18 15:21:30 crc kubenswrapper[4817]: I0218 15:21:30.472680 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/util/0.log" Feb 18 15:21:30 crc kubenswrapper[4817]: I0218 15:21:30.484878 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/pull/0.log" Feb 18 15:21:30 crc kubenswrapper[4817]: I0218 15:21:30.513531 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/extract/0.log" Feb 18 15:21:30 crc kubenswrapper[4817]: I0218 15:21:30.697528 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-utilities/0.log" Feb 18 15:21:30 crc kubenswrapper[4817]: I0218 15:21:30.902038 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-utilities/0.log" Feb 18 15:21:30 crc kubenswrapper[4817]: I0218 15:21:30.926463 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-content/0.log" Feb 18 15:21:30 crc kubenswrapper[4817]: I0218 15:21:30.936713 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-content/0.log" Feb 18 15:21:31 crc kubenswrapper[4817]: I0218 15:21:31.374231 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-utilities/0.log" Feb 18 15:21:31 crc kubenswrapper[4817]: I0218 15:21:31.450381 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-content/0.log" Feb 18 15:21:31 crc kubenswrapper[4817]: I0218 15:21:31.720723 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-utilities/0.log" Feb 18 15:21:31 crc kubenswrapper[4817]: I0218 15:21:31.904099 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/registry-server/0.log" Feb 18 15:21:32 crc kubenswrapper[4817]: I0218 15:21:32.035446 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-utilities/0.log" Feb 18 15:21:32 crc kubenswrapper[4817]: I0218 15:21:32.070059 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-content/0.log" Feb 18 15:21:32 crc kubenswrapper[4817]: I0218 15:21:32.107002 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-content/0.log" Feb 18 15:21:32 crc kubenswrapper[4817]: I0218 15:21:32.227724 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-utilities/0.log" Feb 18 15:21:32 crc kubenswrapper[4817]: I0218 15:21:32.268888 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-content/0.log" Feb 18 15:21:32 crc kubenswrapper[4817]: I0218 15:21:32.591127 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/util/0.log" Feb 18 15:21:32 crc kubenswrapper[4817]: I0218 15:21:32.601717 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/registry-server/0.log" Feb 18 15:21:32 crc kubenswrapper[4817]: I0218 15:21:32.838424 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/util/0.log" Feb 18 15:21:32 crc kubenswrapper[4817]: I0218 15:21:32.868004 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/pull/0.log" Feb 18 15:21:32 crc kubenswrapper[4817]: I0218 15:21:32.891754 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/pull/0.log" Feb 18 15:21:33 crc kubenswrapper[4817]: I0218 15:21:33.113590 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/util/0.log" Feb 18 15:21:33 crc kubenswrapper[4817]: I0218 15:21:33.138847 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/extract/0.log" Feb 18 15:21:33 crc kubenswrapper[4817]: I0218 15:21:33.141597 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/pull/0.log" Feb 18 15:21:33 crc kubenswrapper[4817]: I0218 15:21:33.149923 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cw95t_039201b1-3f23-4f22-80cb-17f07e1732df/marketplace-operator/0.log" Feb 18 15:21:33 crc kubenswrapper[4817]: I0218 15:21:33.958838 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-utilities/0.log" Feb 18 15:21:34 crc kubenswrapper[4817]: I0218 15:21:34.199326 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-utilities/0.log" Feb 18 15:21:34 crc kubenswrapper[4817]: I0218 15:21:34.220714 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-content/0.log" Feb 18 15:21:34 crc kubenswrapper[4817]: I0218 15:21:34.220914 4817 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-content/0.log" Feb 18 15:21:34 crc kubenswrapper[4817]: I0218 15:21:34.363208 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-utilities/0.log" Feb 18 15:21:34 crc kubenswrapper[4817]: I0218 15:21:34.401099 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-content/0.log" Feb 18 15:21:34 crc kubenswrapper[4817]: I0218 15:21:34.412386 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-utilities/0.log" Feb 18 15:21:34 crc kubenswrapper[4817]: I0218 15:21:34.604770 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/registry-server/0.log" Feb 18 15:21:34 crc kubenswrapper[4817]: I0218 15:21:34.717954 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-content/0.log" Feb 18 15:21:34 crc kubenswrapper[4817]: I0218 15:21:34.777577 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-utilities/0.log" Feb 18 15:21:34 crc kubenswrapper[4817]: I0218 15:21:34.817309 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-content/0.log" Feb 18 15:21:35 crc kubenswrapper[4817]: I0218 15:21:35.163146 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-utilities/0.log" Feb 18 15:21:35 crc kubenswrapper[4817]: I0218 15:21:35.391297 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-content/0.log" Feb 18 15:21:35 crc kubenswrapper[4817]: I0218 15:21:35.759849 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/registry-server/0.log" Feb 18 15:21:49 crc kubenswrapper[4817]: I0218 15:21:49.967532 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_fad4abaa-bb3e-4fa2-9478-37e792ead430/prometheus-operator-admission-webhook/0.log" Feb 18 15:21:50 crc kubenswrapper[4817]: I0218 15:21:50.017545 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_fa24c32b-4905-4756-a765-195d6b0b6c1a/prometheus-operator-admission-webhook/0.log" Feb 18 15:21:50 crc kubenswrapper[4817]: I0218 15:21:50.051671 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2p6zw_bc446e23-6b46-40cc-b058-5f8d491d8310/prometheus-operator/0.log" Feb 18 15:21:50 crc kubenswrapper[4817]: I0218 15:21:50.208707 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qsnlp_f5816544-7d2c-4bf3-aeab-98f546573810/perses-operator/0.log" Feb 18 15:21:50 crc kubenswrapper[4817]: I0218 15:21:50.234659 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-xcdgl_02b7c5c2-ac49-498f-9c4c-c64cf484d131/operator/0.log" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.216718 4817 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-gfqmd"] Feb 18 15:22:00 crc kubenswrapper[4817]: E0218 15:22:00.217825 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c52f79-3702-49f7-b44e-76179b96683f" containerName="extract-utilities" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.217845 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c52f79-3702-49f7-b44e-76179b96683f" containerName="extract-utilities" Feb 18 15:22:00 crc kubenswrapper[4817]: E0218 15:22:00.217858 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c52f79-3702-49f7-b44e-76179b96683f" containerName="extract-content" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.217866 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c52f79-3702-49f7-b44e-76179b96683f" containerName="extract-content" Feb 18 15:22:00 crc kubenswrapper[4817]: E0218 15:22:00.217894 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c52f79-3702-49f7-b44e-76179b96683f" containerName="registry-server" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.217903 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c52f79-3702-49f7-b44e-76179b96683f" containerName="registry-server" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.218197 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c52f79-3702-49f7-b44e-76179b96683f" containerName="registry-server" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.220136 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.230801 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gfqmd"] Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.395559 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-catalog-content\") pod \"community-operators-gfqmd\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.395963 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7jxt\" (UniqueName: \"kubernetes.io/projected/348f881a-e8c2-419d-8b70-133d9aace1b8-kube-api-access-r7jxt\") pod \"community-operators-gfqmd\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.396187 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-utilities\") pod \"community-operators-gfqmd\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.497712 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-catalog-content\") pod \"community-operators-gfqmd\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.497772 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r7jxt\" (UniqueName: \"kubernetes.io/projected/348f881a-e8c2-419d-8b70-133d9aace1b8-kube-api-access-r7jxt\") pod \"community-operators-gfqmd\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.497894 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-utilities\") pod \"community-operators-gfqmd\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.498274 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-catalog-content\") pod \"community-operators-gfqmd\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.498441 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-utilities\") pod \"community-operators-gfqmd\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.527062 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7jxt\" (UniqueName: \"kubernetes.io/projected/348f881a-e8c2-419d-8b70-133d9aace1b8-kube-api-access-r7jxt\") pod \"community-operators-gfqmd\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:00 crc kubenswrapper[4817]: I0218 15:22:00.562046 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:01 crc kubenswrapper[4817]: I0218 15:22:01.219263 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gfqmd"] Feb 18 15:22:01 crc kubenswrapper[4817]: I0218 15:22:01.831385 4817 generic.go:334] "Generic (PLEG): container finished" podID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerID="350c8d0789af7a1900e8a87035bfbdbef9f855a7ad8c77ac06e2aa84d357da38" exitCode=0 Feb 18 15:22:01 crc kubenswrapper[4817]: I0218 15:22:01.831486 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfqmd" event={"ID":"348f881a-e8c2-419d-8b70-133d9aace1b8","Type":"ContainerDied","Data":"350c8d0789af7a1900e8a87035bfbdbef9f855a7ad8c77ac06e2aa84d357da38"} Feb 18 15:22:01 crc kubenswrapper[4817]: I0218 15:22:01.831739 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfqmd" event={"ID":"348f881a-e8c2-419d-8b70-133d9aace1b8","Type":"ContainerStarted","Data":"5e71680f912ec8a266732b5ddb0bfec072eafc62f9a1a9132f40b6d7303457a0"} Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.414409 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zz5kx"] Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.418707 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.426967 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zz5kx"] Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.571792 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-utilities\") pod \"redhat-operators-zz5kx\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.571954 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-catalog-content\") pod \"redhat-operators-zz5kx\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.572099 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5szw\" (UniqueName: \"kubernetes.io/projected/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-kube-api-access-x5szw\") pod \"redhat-operators-zz5kx\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.674628 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-utilities\") pod \"redhat-operators-zz5kx\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.674824 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-catalog-content\") pod \"redhat-operators-zz5kx\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.674873 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5szw\" (UniqueName: \"kubernetes.io/projected/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-kube-api-access-x5szw\") pod \"redhat-operators-zz5kx\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.675842 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-catalog-content\") pod \"redhat-operators-zz5kx\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.675920 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-utilities\") pod \"redhat-operators-zz5kx\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.710430 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5szw\" (UniqueName: \"kubernetes.io/projected/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-kube-api-access-x5szw\") pod \"redhat-operators-zz5kx\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.742796 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:03 crc kubenswrapper[4817]: I0218 15:22:03.868661 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfqmd" event={"ID":"348f881a-e8c2-419d-8b70-133d9aace1b8","Type":"ContainerStarted","Data":"21c45396ef21efbc9b7c57f50d001b85892809ad07b214cba7aa2b2d9fac3a1b"} Feb 18 15:22:04 crc kubenswrapper[4817]: W0218 15:22:04.808940 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda657b26e_30dd_4246_bc83_dbbe9c58e4d6.slice/crio-2ae1ce0cdcad0f685e09614a39b9cf32f238128ad4edf300b007ec62994a8450 WatchSource:0}: Error finding container 2ae1ce0cdcad0f685e09614a39b9cf32f238128ad4edf300b007ec62994a8450: Status 404 returned error can't find the container with id 2ae1ce0cdcad0f685e09614a39b9cf32f238128ad4edf300b007ec62994a8450 Feb 18 15:22:04 crc kubenswrapper[4817]: I0218 15:22:04.813112 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zz5kx"] Feb 18 15:22:04 crc kubenswrapper[4817]: I0218 15:22:04.904419 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz5kx" event={"ID":"a657b26e-30dd-4246-bc83-dbbe9c58e4d6","Type":"ContainerStarted","Data":"2ae1ce0cdcad0f685e09614a39b9cf32f238128ad4edf300b007ec62994a8450"} Feb 18 15:22:05 crc kubenswrapper[4817]: I0218 15:22:05.920268 4817 generic.go:334] "Generic (PLEG): container finished" podID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerID="20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3" exitCode=0 Feb 18 15:22:05 crc kubenswrapper[4817]: I0218 15:22:05.920377 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz5kx" 
event={"ID":"a657b26e-30dd-4246-bc83-dbbe9c58e4d6","Type":"ContainerDied","Data":"20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3"} Feb 18 15:22:05 crc kubenswrapper[4817]: I0218 15:22:05.924397 4817 generic.go:334] "Generic (PLEG): container finished" podID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerID="21c45396ef21efbc9b7c57f50d001b85892809ad07b214cba7aa2b2d9fac3a1b" exitCode=0 Feb 18 15:22:05 crc kubenswrapper[4817]: I0218 15:22:05.924445 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfqmd" event={"ID":"348f881a-e8c2-419d-8b70-133d9aace1b8","Type":"ContainerDied","Data":"21c45396ef21efbc9b7c57f50d001b85892809ad07b214cba7aa2b2d9fac3a1b"} Feb 18 15:22:06 crc kubenswrapper[4817]: I0218 15:22:06.935501 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfqmd" event={"ID":"348f881a-e8c2-419d-8b70-133d9aace1b8","Type":"ContainerStarted","Data":"63747244eadef2bb999761ab7dfce68ab33f01653f91cc5cdcb67acc17c85844"} Feb 18 15:22:06 crc kubenswrapper[4817]: I0218 15:22:06.937961 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz5kx" event={"ID":"a657b26e-30dd-4246-bc83-dbbe9c58e4d6","Type":"ContainerStarted","Data":"65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09"} Feb 18 15:22:06 crc kubenswrapper[4817]: I0218 15:22:06.966047 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gfqmd" podStartSLOduration=2.4676935 podStartE2EDuration="6.966025482s" podCreationTimestamp="2026-02-18 15:22:00 +0000 UTC" firstStartedPulling="2026-02-18 15:22:01.833076148 +0000 UTC m=+4984.408612141" lastFinishedPulling="2026-02-18 15:22:06.33140814 +0000 UTC m=+4988.906944123" observedRunningTime="2026-02-18 15:22:06.955927817 +0000 UTC m=+4989.531463800" watchObservedRunningTime="2026-02-18 15:22:06.966025482 +0000 UTC 
m=+4989.541561465" Feb 18 15:22:09 crc kubenswrapper[4817]: I0218 15:22:09.251117 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59d4b4c7c-rvnbw_bf27f33f-390f-44fa-91fb-40f18240d0df/kube-rbac-proxy/0.log" Feb 18 15:22:09 crc kubenswrapper[4817]: I0218 15:22:09.303138 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59d4b4c7c-rvnbw_bf27f33f-390f-44fa-91fb-40f18240d0df/manager/0.log" Feb 18 15:22:10 crc kubenswrapper[4817]: I0218 15:22:10.562996 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:10 crc kubenswrapper[4817]: I0218 15:22:10.563340 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:10 crc kubenswrapper[4817]: I0218 15:22:10.618163 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:12 crc kubenswrapper[4817]: I0218 15:22:12.994439 4817 generic.go:334] "Generic (PLEG): container finished" podID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerID="65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09" exitCode=0 Feb 18 15:22:12 crc kubenswrapper[4817]: I0218 15:22:12.994506 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz5kx" event={"ID":"a657b26e-30dd-4246-bc83-dbbe9c58e4d6","Type":"ContainerDied","Data":"65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09"} Feb 18 15:22:14 crc kubenswrapper[4817]: I0218 15:22:14.023238 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz5kx" 
event={"ID":"a657b26e-30dd-4246-bc83-dbbe9c58e4d6","Type":"ContainerStarted","Data":"6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638"} Feb 18 15:22:14 crc kubenswrapper[4817]: I0218 15:22:14.053780 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zz5kx" podStartSLOduration=3.547686097 podStartE2EDuration="11.05375784s" podCreationTimestamp="2026-02-18 15:22:03 +0000 UTC" firstStartedPulling="2026-02-18 15:22:05.922208966 +0000 UTC m=+4988.497744949" lastFinishedPulling="2026-02-18 15:22:13.428280719 +0000 UTC m=+4996.003816692" observedRunningTime="2026-02-18 15:22:14.043445989 +0000 UTC m=+4996.618981982" watchObservedRunningTime="2026-02-18 15:22:14.05375784 +0000 UTC m=+4996.629293823" Feb 18 15:22:20 crc kubenswrapper[4817]: I0218 15:22:20.631785 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:23 crc kubenswrapper[4817]: I0218 15:22:23.743793 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:23 crc kubenswrapper[4817]: I0218 15:22:23.744410 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:24 crc kubenswrapper[4817]: I0218 15:22:24.201547 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gfqmd"] Feb 18 15:22:24 crc kubenswrapper[4817]: I0218 15:22:24.201796 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gfqmd" podUID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerName="registry-server" containerID="cri-o://63747244eadef2bb999761ab7dfce68ab33f01653f91cc5cdcb67acc17c85844" gracePeriod=2 Feb 18 15:22:24 crc kubenswrapper[4817]: I0218 15:22:24.833417 4817 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-zz5kx" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="registry-server" probeResult="failure" output=< Feb 18 15:22:24 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 15:22:24 crc kubenswrapper[4817]: > Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.145279 4817 generic.go:334] "Generic (PLEG): container finished" podID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerID="63747244eadef2bb999761ab7dfce68ab33f01653f91cc5cdcb67acc17c85844" exitCode=0 Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.145328 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfqmd" event={"ID":"348f881a-e8c2-419d-8b70-133d9aace1b8","Type":"ContainerDied","Data":"63747244eadef2bb999761ab7dfce68ab33f01653f91cc5cdcb67acc17c85844"} Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.296548 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.416221 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7jxt\" (UniqueName: \"kubernetes.io/projected/348f881a-e8c2-419d-8b70-133d9aace1b8-kube-api-access-r7jxt\") pod \"348f881a-e8c2-419d-8b70-133d9aace1b8\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.416439 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-utilities\") pod \"348f881a-e8c2-419d-8b70-133d9aace1b8\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.417162 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-utilities" (OuterVolumeSpecName: "utilities") pod "348f881a-e8c2-419d-8b70-133d9aace1b8" (UID: "348f881a-e8c2-419d-8b70-133d9aace1b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.417333 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-catalog-content\") pod \"348f881a-e8c2-419d-8b70-133d9aace1b8\" (UID: \"348f881a-e8c2-419d-8b70-133d9aace1b8\") " Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.434266 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/348f881a-e8c2-419d-8b70-133d9aace1b8-kube-api-access-r7jxt" (OuterVolumeSpecName: "kube-api-access-r7jxt") pod "348f881a-e8c2-419d-8b70-133d9aace1b8" (UID: "348f881a-e8c2-419d-8b70-133d9aace1b8"). InnerVolumeSpecName "kube-api-access-r7jxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.435249 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.435285 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7jxt\" (UniqueName: \"kubernetes.io/projected/348f881a-e8c2-419d-8b70-133d9aace1b8-kube-api-access-r7jxt\") on node \"crc\" DevicePath \"\"" Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.511699 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "348f881a-e8c2-419d-8b70-133d9aace1b8" (UID: "348f881a-e8c2-419d-8b70-133d9aace1b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:22:25 crc kubenswrapper[4817]: I0218 15:22:25.537239 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/348f881a-e8c2-419d-8b70-133d9aace1b8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 15:22:26 crc kubenswrapper[4817]: I0218 15:22:26.158793 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gfqmd" event={"ID":"348f881a-e8c2-419d-8b70-133d9aace1b8","Type":"ContainerDied","Data":"5e71680f912ec8a266732b5ddb0bfec072eafc62f9a1a9132f40b6d7303457a0"} Feb 18 15:22:26 crc kubenswrapper[4817]: I0218 15:22:26.159098 4817 scope.go:117] "RemoveContainer" containerID="63747244eadef2bb999761ab7dfce68ab33f01653f91cc5cdcb67acc17c85844" Feb 18 15:22:26 crc kubenswrapper[4817]: I0218 15:22:26.159229 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gfqmd" Feb 18 15:22:26 crc kubenswrapper[4817]: I0218 15:22:26.198313 4817 scope.go:117] "RemoveContainer" containerID="21c45396ef21efbc9b7c57f50d001b85892809ad07b214cba7aa2b2d9fac3a1b" Feb 18 15:22:26 crc kubenswrapper[4817]: I0218 15:22:26.210553 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gfqmd"] Feb 18 15:22:26 crc kubenswrapper[4817]: I0218 15:22:26.226315 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gfqmd"] Feb 18 15:22:26 crc kubenswrapper[4817]: I0218 15:22:26.271198 4817 scope.go:117] "RemoveContainer" containerID="350c8d0789af7a1900e8a87035bfbdbef9f855a7ad8c77ac06e2aa84d357da38" Feb 18 15:22:28 crc kubenswrapper[4817]: I0218 15:22:28.183318 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="348f881a-e8c2-419d-8b70-133d9aace1b8" path="/var/lib/kubelet/pods/348f881a-e8c2-419d-8b70-133d9aace1b8/volumes" Feb 18 15:22:34 crc kubenswrapper[4817]: I0218 15:22:34.814907 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zz5kx" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="registry-server" probeResult="failure" output=< Feb 18 15:22:34 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 15:22:34 crc kubenswrapper[4817]: > Feb 18 15:22:42 crc kubenswrapper[4817]: I0218 15:22:42.863167 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:22:42 crc kubenswrapper[4817]: I0218 15:22:42.863867 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" 
podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:22:44 crc kubenswrapper[4817]: I0218 15:22:44.791741 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zz5kx" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="registry-server" probeResult="failure" output=< Feb 18 15:22:44 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 15:22:44 crc kubenswrapper[4817]: > Feb 18 15:22:51 crc kubenswrapper[4817]: I0218 15:22:51.490530 4817 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-n8c26" podUID="864c0a91-5aa3-4a84-8b75-6f75e0883aea" containerName="gateway" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 18 15:22:53 crc kubenswrapper[4817]: I0218 15:22:53.792405 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:53 crc kubenswrapper[4817]: I0218 15:22:53.846171 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:54 crc kubenswrapper[4817]: I0218 15:22:54.026348 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zz5kx"] Feb 18 15:22:55 crc kubenswrapper[4817]: I0218 15:22:55.472814 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zz5kx" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="registry-server" containerID="cri-o://6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638" gracePeriod=2 Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.234873 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.303559 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5szw\" (UniqueName: \"kubernetes.io/projected/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-kube-api-access-x5szw\") pod \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.303645 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-utilities\") pod \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.303710 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-catalog-content\") pod \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\" (UID: \"a657b26e-30dd-4246-bc83-dbbe9c58e4d6\") " Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.304883 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-utilities" (OuterVolumeSpecName: "utilities") pod "a657b26e-30dd-4246-bc83-dbbe9c58e4d6" (UID: "a657b26e-30dd-4246-bc83-dbbe9c58e4d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.311461 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-kube-api-access-x5szw" (OuterVolumeSpecName: "kube-api-access-x5szw") pod "a657b26e-30dd-4246-bc83-dbbe9c58e4d6" (UID: "a657b26e-30dd-4246-bc83-dbbe9c58e4d6"). InnerVolumeSpecName "kube-api-access-x5szw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.406046 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5szw\" (UniqueName: \"kubernetes.io/projected/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-kube-api-access-x5szw\") on node \"crc\" DevicePath \"\"" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.406393 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.451010 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a657b26e-30dd-4246-bc83-dbbe9c58e4d6" (UID: "a657b26e-30dd-4246-bc83-dbbe9c58e4d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.484865 4817 generic.go:334] "Generic (PLEG): container finished" podID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerID="6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638" exitCode=0 Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.484910 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz5kx" event={"ID":"a657b26e-30dd-4246-bc83-dbbe9c58e4d6","Type":"ContainerDied","Data":"6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638"} Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.484942 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz5kx" event={"ID":"a657b26e-30dd-4246-bc83-dbbe9c58e4d6","Type":"ContainerDied","Data":"2ae1ce0cdcad0f685e09614a39b9cf32f238128ad4edf300b007ec62994a8450"} Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.484960 
4817 scope.go:117] "RemoveContainer" containerID="6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.485120 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zz5kx" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.508085 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a657b26e-30dd-4246-bc83-dbbe9c58e4d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.517855 4817 scope.go:117] "RemoveContainer" containerID="65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.538532 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zz5kx"] Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.549639 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zz5kx"] Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.553781 4817 scope.go:117] "RemoveContainer" containerID="20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.605427 4817 scope.go:117] "RemoveContainer" containerID="6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638" Feb 18 15:22:56 crc kubenswrapper[4817]: E0218 15:22:56.605829 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638\": container with ID starting with 6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638 not found: ID does not exist" containerID="6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.605854 4817 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638"} err="failed to get container status \"6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638\": rpc error: code = NotFound desc = could not find container \"6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638\": container with ID starting with 6e355f255309729a08035dcee33e921eb9b91670e1a5ab481e48e4d707fce638 not found: ID does not exist" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.605874 4817 scope.go:117] "RemoveContainer" containerID="65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09" Feb 18 15:22:56 crc kubenswrapper[4817]: E0218 15:22:56.606059 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09\": container with ID starting with 65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09 not found: ID does not exist" containerID="65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.606076 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09"} err="failed to get container status \"65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09\": rpc error: code = NotFound desc = could not find container \"65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09\": container with ID starting with 65009afe41557944f76e1f535ea8ca7af98c0ad2957119581468eef8b3568a09 not found: ID does not exist" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.606088 4817 scope.go:117] "RemoveContainer" containerID="20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3" Feb 18 15:22:56 crc kubenswrapper[4817]: E0218 
15:22:56.606351 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3\": container with ID starting with 20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3 not found: ID does not exist" containerID="20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3" Feb 18 15:22:56 crc kubenswrapper[4817]: I0218 15:22:56.606403 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3"} err="failed to get container status \"20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3\": rpc error: code = NotFound desc = could not find container \"20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3\": container with ID starting with 20da8deaf9fee06e85e56f1250ed71180f70ba33e81571507bb59cb2653697f3 not found: ID does not exist" Feb 18 15:22:58 crc kubenswrapper[4817]: I0218 15:22:58.204282 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" path="/var/lib/kubelet/pods/a657b26e-30dd-4246-bc83-dbbe9c58e4d6/volumes" Feb 18 15:23:12 crc kubenswrapper[4817]: I0218 15:23:12.863294 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:23:12 crc kubenswrapper[4817]: I0218 15:23:12.864002 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 18 15:23:34 crc kubenswrapper[4817]: I0218 15:23:34.277614 4817 scope.go:117] "RemoveContainer" containerID="c813cd2d1cd6c8bb2376f843f249feb3bd90ba048ac18f948a8a71f735339017" Feb 18 15:23:42 crc kubenswrapper[4817]: I0218 15:23:42.863447 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:23:42 crc kubenswrapper[4817]: I0218 15:23:42.864057 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:23:42 crc kubenswrapper[4817]: I0218 15:23:42.864107 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 15:23:42 crc kubenswrapper[4817]: I0218 15:23:42.864863 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de60e1d295605a5ab47852c07cbd58273509dad50b22558ccdb59ed92af85e4c"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 15:23:42 crc kubenswrapper[4817]: I0218 15:23:42.864917 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://de60e1d295605a5ab47852c07cbd58273509dad50b22558ccdb59ed92af85e4c" gracePeriod=600 Feb 18 15:23:43 crc 
kubenswrapper[4817]: I0218 15:23:43.959213 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="de60e1d295605a5ab47852c07cbd58273509dad50b22558ccdb59ed92af85e4c" exitCode=0 Feb 18 15:23:43 crc kubenswrapper[4817]: I0218 15:23:43.959282 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"de60e1d295605a5ab47852c07cbd58273509dad50b22558ccdb59ed92af85e4c"} Feb 18 15:23:43 crc kubenswrapper[4817]: I0218 15:23:43.959850 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514"} Feb 18 15:23:43 crc kubenswrapper[4817]: I0218 15:23:43.959881 4817 scope.go:117] "RemoveContainer" containerID="4984bca70bc0993e719f0bc7f551bc184a0706e39bad62118e0e29921ef0f456" Feb 18 15:24:19 crc kubenswrapper[4817]: I0218 15:24:19.307059 4817 generic.go:334] "Generic (PLEG): container finished" podID="10c9d1d1-19a5-49f6-9466-5395dc592916" containerID="f90b8a1390161835f4f4c2bae86ad312505d978f84aaff3153555a382150c1ef" exitCode=0 Feb 18 15:24:19 crc kubenswrapper[4817]: I0218 15:24:19.307317 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khngq/must-gather-8xxh7" event={"ID":"10c9d1d1-19a5-49f6-9466-5395dc592916","Type":"ContainerDied","Data":"f90b8a1390161835f4f4c2bae86ad312505d978f84aaff3153555a382150c1ef"} Feb 18 15:24:19 crc kubenswrapper[4817]: I0218 15:24:19.308297 4817 scope.go:117] "RemoveContainer" containerID="f90b8a1390161835f4f4c2bae86ad312505d978f84aaff3153555a382150c1ef" Feb 18 15:24:19 crc kubenswrapper[4817]: I0218 15:24:19.560407 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-khngq_must-gather-8xxh7_10c9d1d1-19a5-49f6-9466-5395dc592916/gather/0.log" Feb 18 15:24:27 crc kubenswrapper[4817]: I0218 15:24:27.136664 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-khngq/must-gather-8xxh7"] Feb 18 15:24:27 crc kubenswrapper[4817]: I0218 15:24:27.137274 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-khngq/must-gather-8xxh7" podUID="10c9d1d1-19a5-49f6-9466-5395dc592916" containerName="copy" containerID="cri-o://52737c484eadbc6de7ed9d471b9d6ffefbc1357ec6591d9ff6924c0234658ad0" gracePeriod=2 Feb 18 15:24:27 crc kubenswrapper[4817]: I0218 15:24:27.151231 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-khngq/must-gather-8xxh7"] Feb 18 15:24:27 crc kubenswrapper[4817]: I0218 15:24:27.398146 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-khngq_must-gather-8xxh7_10c9d1d1-19a5-49f6-9466-5395dc592916/copy/0.log" Feb 18 15:24:27 crc kubenswrapper[4817]: I0218 15:24:27.398645 4817 generic.go:334] "Generic (PLEG): container finished" podID="10c9d1d1-19a5-49f6-9466-5395dc592916" containerID="52737c484eadbc6de7ed9d471b9d6ffefbc1357ec6591d9ff6924c0234658ad0" exitCode=143 Feb 18 15:24:27 crc kubenswrapper[4817]: I0218 15:24:27.986239 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-khngq_must-gather-8xxh7_10c9d1d1-19a5-49f6-9466-5395dc592916/copy/0.log" Feb 18 15:24:27 crc kubenswrapper[4817]: I0218 15:24:27.986899 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-khngq/must-gather-8xxh7" Feb 18 15:24:28 crc kubenswrapper[4817]: I0218 15:24:28.146923 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10c9d1d1-19a5-49f6-9466-5395dc592916-must-gather-output\") pod \"10c9d1d1-19a5-49f6-9466-5395dc592916\" (UID: \"10c9d1d1-19a5-49f6-9466-5395dc592916\") " Feb 18 15:24:28 crc kubenswrapper[4817]: I0218 15:24:28.147025 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s554\" (UniqueName: \"kubernetes.io/projected/10c9d1d1-19a5-49f6-9466-5395dc592916-kube-api-access-5s554\") pod \"10c9d1d1-19a5-49f6-9466-5395dc592916\" (UID: \"10c9d1d1-19a5-49f6-9466-5395dc592916\") " Feb 18 15:24:28 crc kubenswrapper[4817]: I0218 15:24:28.165675 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c9d1d1-19a5-49f6-9466-5395dc592916-kube-api-access-5s554" (OuterVolumeSpecName: "kube-api-access-5s554") pod "10c9d1d1-19a5-49f6-9466-5395dc592916" (UID: "10c9d1d1-19a5-49f6-9466-5395dc592916"). InnerVolumeSpecName "kube-api-access-5s554". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:24:28 crc kubenswrapper[4817]: I0218 15:24:28.250177 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s554\" (UniqueName: \"kubernetes.io/projected/10c9d1d1-19a5-49f6-9466-5395dc592916-kube-api-access-5s554\") on node \"crc\" DevicePath \"\"" Feb 18 15:24:28 crc kubenswrapper[4817]: I0218 15:24:28.347375 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c9d1d1-19a5-49f6-9466-5395dc592916-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "10c9d1d1-19a5-49f6-9466-5395dc592916" (UID: "10c9d1d1-19a5-49f6-9466-5395dc592916"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:24:28 crc kubenswrapper[4817]: I0218 15:24:28.351880 4817 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10c9d1d1-19a5-49f6-9466-5395dc592916-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 18 15:24:28 crc kubenswrapper[4817]: I0218 15:24:28.410023 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-khngq_must-gather-8xxh7_10c9d1d1-19a5-49f6-9466-5395dc592916/copy/0.log" Feb 18 15:24:28 crc kubenswrapper[4817]: I0218 15:24:28.410486 4817 scope.go:117] "RemoveContainer" containerID="52737c484eadbc6de7ed9d471b9d6ffefbc1357ec6591d9ff6924c0234658ad0" Feb 18 15:24:28 crc kubenswrapper[4817]: I0218 15:24:28.410568 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khngq/must-gather-8xxh7" Feb 18 15:24:28 crc kubenswrapper[4817]: I0218 15:24:28.443361 4817 scope.go:117] "RemoveContainer" containerID="f90b8a1390161835f4f4c2bae86ad312505d978f84aaff3153555a382150c1ef" Feb 18 15:24:30 crc kubenswrapper[4817]: I0218 15:24:30.186284 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c9d1d1-19a5-49f6-9466-5395dc592916" path="/var/lib/kubelet/pods/10c9d1d1-19a5-49f6-9466-5395dc592916/volumes" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.543961 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vk9pl"] Feb 18 15:24:33 crc kubenswrapper[4817]: E0218 15:24:33.544789 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c9d1d1-19a5-49f6-9466-5395dc592916" containerName="gather" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.544802 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c9d1d1-19a5-49f6-9466-5395dc592916" containerName="gather" Feb 18 15:24:33 crc kubenswrapper[4817]: E0218 15:24:33.544817 4817 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="registry-server" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.544824 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="registry-server" Feb 18 15:24:33 crc kubenswrapper[4817]: E0218 15:24:33.544837 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerName="extract-content" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.544842 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerName="extract-content" Feb 18 15:24:33 crc kubenswrapper[4817]: E0218 15:24:33.544855 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerName="registry-server" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.544872 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerName="registry-server" Feb 18 15:24:33 crc kubenswrapper[4817]: E0218 15:24:33.544903 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="extract-utilities" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.544910 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="extract-utilities" Feb 18 15:24:33 crc kubenswrapper[4817]: E0218 15:24:33.544927 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="extract-content" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.544933 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="extract-content" Feb 18 15:24:33 crc kubenswrapper[4817]: E0218 15:24:33.544946 4817 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="10c9d1d1-19a5-49f6-9466-5395dc592916" containerName="copy" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.544953 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c9d1d1-19a5-49f6-9466-5395dc592916" containerName="copy" Feb 18 15:24:33 crc kubenswrapper[4817]: E0218 15:24:33.544970 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerName="extract-utilities" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.544991 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerName="extract-utilities" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.545205 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="a657b26e-30dd-4246-bc83-dbbe9c58e4d6" containerName="registry-server" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.545226 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="348f881a-e8c2-419d-8b70-133d9aace1b8" containerName="registry-server" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.545238 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c9d1d1-19a5-49f6-9466-5395dc592916" containerName="copy" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.545255 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c9d1d1-19a5-49f6-9466-5395dc592916" containerName="gather" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.546846 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.560542 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vk9pl"] Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.663299 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-utilities\") pod \"certified-operators-vk9pl\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.663373 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d7vv\" (UniqueName: \"kubernetes.io/projected/a9bafa96-f855-428b-a191-23680be1422a-kube-api-access-5d7vv\") pod \"certified-operators-vk9pl\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.663408 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-catalog-content\") pod \"certified-operators-vk9pl\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.765536 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-utilities\") pod \"certified-operators-vk9pl\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.765598 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5d7vv\" (UniqueName: \"kubernetes.io/projected/a9bafa96-f855-428b-a191-23680be1422a-kube-api-access-5d7vv\") pod \"certified-operators-vk9pl\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.765624 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-catalog-content\") pod \"certified-operators-vk9pl\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.766227 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-catalog-content\") pod \"certified-operators-vk9pl\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.766846 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-utilities\") pod \"certified-operators-vk9pl\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.789261 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d7vv\" (UniqueName: \"kubernetes.io/projected/a9bafa96-f855-428b-a191-23680be1422a-kube-api-access-5d7vv\") pod \"certified-operators-vk9pl\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:33 crc kubenswrapper[4817]: I0218 15:24:33.877669 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:34 crc kubenswrapper[4817]: I0218 15:24:34.378271 4817 scope.go:117] "RemoveContainer" containerID="f417b606eacd8b00a427ec382dcc8fdbbb60a4eef17209504cb18e3ab9f01803" Feb 18 15:24:34 crc kubenswrapper[4817]: I0218 15:24:34.460104 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vk9pl"] Feb 18 15:24:34 crc kubenswrapper[4817]: I0218 15:24:34.527159 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk9pl" event={"ID":"a9bafa96-f855-428b-a191-23680be1422a","Type":"ContainerStarted","Data":"719f42f9de9fafb8cfc57ccdc59fd95473e8561b5625f406435977a9932bd6b9"} Feb 18 15:24:35 crc kubenswrapper[4817]: I0218 15:24:35.538958 4817 generic.go:334] "Generic (PLEG): container finished" podID="a9bafa96-f855-428b-a191-23680be1422a" containerID="29c600ad91a556f334a0c7161b85e7ffe1082ba6f8133ce56e97bb696978c398" exitCode=0 Feb 18 15:24:35 crc kubenswrapper[4817]: I0218 15:24:35.539015 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk9pl" event={"ID":"a9bafa96-f855-428b-a191-23680be1422a","Type":"ContainerDied","Data":"29c600ad91a556f334a0c7161b85e7ffe1082ba6f8133ce56e97bb696978c398"} Feb 18 15:24:35 crc kubenswrapper[4817]: I0218 15:24:35.541639 4817 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 18 15:24:36 crc kubenswrapper[4817]: I0218 15:24:36.551033 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk9pl" event={"ID":"a9bafa96-f855-428b-a191-23680be1422a","Type":"ContainerStarted","Data":"865dcfe419a7c74675d063fda74bca9c9ab9b2f3a9eff3e49f8baf99b5d91e19"} Feb 18 15:24:38 crc kubenswrapper[4817]: I0218 15:24:38.572508 4817 generic.go:334] "Generic (PLEG): container finished" podID="a9bafa96-f855-428b-a191-23680be1422a" 
containerID="865dcfe419a7c74675d063fda74bca9c9ab9b2f3a9eff3e49f8baf99b5d91e19" exitCode=0 Feb 18 15:24:38 crc kubenswrapper[4817]: I0218 15:24:38.572589 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk9pl" event={"ID":"a9bafa96-f855-428b-a191-23680be1422a","Type":"ContainerDied","Data":"865dcfe419a7c74675d063fda74bca9c9ab9b2f3a9eff3e49f8baf99b5d91e19"} Feb 18 15:24:39 crc kubenswrapper[4817]: I0218 15:24:39.587815 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk9pl" event={"ID":"a9bafa96-f855-428b-a191-23680be1422a","Type":"ContainerStarted","Data":"2111af743ad7edd1f102e7d4d9d60c3165dcef2a73f12ea7d0b48274b1b44a60"} Feb 18 15:24:39 crc kubenswrapper[4817]: I0218 15:24:39.625189 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vk9pl" podStartSLOduration=3.158727448 podStartE2EDuration="6.625169485s" podCreationTimestamp="2026-02-18 15:24:33 +0000 UTC" firstStartedPulling="2026-02-18 15:24:35.541323191 +0000 UTC m=+5138.116859174" lastFinishedPulling="2026-02-18 15:24:39.007765218 +0000 UTC m=+5141.583301211" observedRunningTime="2026-02-18 15:24:39.617899471 +0000 UTC m=+5142.193435444" watchObservedRunningTime="2026-02-18 15:24:39.625169485 +0000 UTC m=+5142.200705468" Feb 18 15:24:43 crc kubenswrapper[4817]: I0218 15:24:43.878515 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:43 crc kubenswrapper[4817]: I0218 15:24:43.880257 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:43 crc kubenswrapper[4817]: I0218 15:24:43.935925 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:44 crc kubenswrapper[4817]: I0218 
15:24:44.679461 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:44 crc kubenswrapper[4817]: I0218 15:24:44.736708 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vk9pl"] Feb 18 15:24:46 crc kubenswrapper[4817]: I0218 15:24:46.646424 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vk9pl" podUID="a9bafa96-f855-428b-a191-23680be1422a" containerName="registry-server" containerID="cri-o://2111af743ad7edd1f102e7d4d9d60c3165dcef2a73f12ea7d0b48274b1b44a60" gracePeriod=2 Feb 18 15:24:47 crc kubenswrapper[4817]: I0218 15:24:47.659054 4817 generic.go:334] "Generic (PLEG): container finished" podID="a9bafa96-f855-428b-a191-23680be1422a" containerID="2111af743ad7edd1f102e7d4d9d60c3165dcef2a73f12ea7d0b48274b1b44a60" exitCode=0 Feb 18 15:24:47 crc kubenswrapper[4817]: I0218 15:24:47.659141 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk9pl" event={"ID":"a9bafa96-f855-428b-a191-23680be1422a","Type":"ContainerDied","Data":"2111af743ad7edd1f102e7d4d9d60c3165dcef2a73f12ea7d0b48274b1b44a60"} Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.087744 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.160682 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d7vv\" (UniqueName: \"kubernetes.io/projected/a9bafa96-f855-428b-a191-23680be1422a-kube-api-access-5d7vv\") pod \"a9bafa96-f855-428b-a191-23680be1422a\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.161014 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-catalog-content\") pod \"a9bafa96-f855-428b-a191-23680be1422a\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.161265 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-utilities\") pod \"a9bafa96-f855-428b-a191-23680be1422a\" (UID: \"a9bafa96-f855-428b-a191-23680be1422a\") " Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.162374 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-utilities" (OuterVolumeSpecName: "utilities") pod "a9bafa96-f855-428b-a191-23680be1422a" (UID: "a9bafa96-f855-428b-a191-23680be1422a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.173952 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9bafa96-f855-428b-a191-23680be1422a-kube-api-access-5d7vv" (OuterVolumeSpecName: "kube-api-access-5d7vv") pod "a9bafa96-f855-428b-a191-23680be1422a" (UID: "a9bafa96-f855-428b-a191-23680be1422a"). InnerVolumeSpecName "kube-api-access-5d7vv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.254590 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9bafa96-f855-428b-a191-23680be1422a" (UID: "a9bafa96-f855-428b-a191-23680be1422a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.266334 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.266370 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9bafa96-f855-428b-a191-23680be1422a-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.266384 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d7vv\" (UniqueName: \"kubernetes.io/projected/a9bafa96-f855-428b-a191-23680be1422a-kube-api-access-5d7vv\") on node \"crc\" DevicePath \"\"" Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.670155 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vk9pl" event={"ID":"a9bafa96-f855-428b-a191-23680be1422a","Type":"ContainerDied","Data":"719f42f9de9fafb8cfc57ccdc59fd95473e8561b5625f406435977a9932bd6b9"} Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.670208 4817 scope.go:117] "RemoveContainer" containerID="2111af743ad7edd1f102e7d4d9d60c3165dcef2a73f12ea7d0b48274b1b44a60" Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.670223 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vk9pl" Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.706492 4817 scope.go:117] "RemoveContainer" containerID="865dcfe419a7c74675d063fda74bca9c9ab9b2f3a9eff3e49f8baf99b5d91e19" Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.717732 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vk9pl"] Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.729535 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vk9pl"] Feb 18 15:24:48 crc kubenswrapper[4817]: I0218 15:24:48.731201 4817 scope.go:117] "RemoveContainer" containerID="29c600ad91a556f334a0c7161b85e7ffe1082ba6f8133ce56e97bb696978c398" Feb 18 15:24:50 crc kubenswrapper[4817]: I0218 15:24:50.185802 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9bafa96-f855-428b-a191-23680be1422a" path="/var/lib/kubelet/pods/a9bafa96-f855-428b-a191-23680be1422a/volumes" Feb 18 15:26:12 crc kubenswrapper[4817]: I0218 15:26:12.863857 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:26:12 crc kubenswrapper[4817]: I0218 15:26:12.864424 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:26:42 crc kubenswrapper[4817]: I0218 15:26:42.862949 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:26:42 crc kubenswrapper[4817]: I0218 15:26:42.863542 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:27:12 crc kubenswrapper[4817]: I0218 15:27:12.863701 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:27:12 crc kubenswrapper[4817]: I0218 15:27:12.864182 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:27:12 crc kubenswrapper[4817]: I0218 15:27:12.864223 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 15:27:12 crc kubenswrapper[4817]: I0218 15:27:12.864951 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 15:27:12 crc kubenswrapper[4817]: I0218 15:27:12.865017 4817 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" gracePeriod=600 Feb 18 15:27:12 crc kubenswrapper[4817]: E0218 15:27:12.988968 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:27:13 crc kubenswrapper[4817]: I0218 15:27:13.615254 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" exitCode=0 Feb 18 15:27:13 crc kubenswrapper[4817]: I0218 15:27:13.615301 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514"} Feb 18 15:27:13 crc kubenswrapper[4817]: I0218 15:27:13.615334 4817 scope.go:117] "RemoveContainer" containerID="de60e1d295605a5ab47852c07cbd58273509dad50b22558ccdb59ed92af85e4c" Feb 18 15:27:13 crc kubenswrapper[4817]: I0218 15:27:13.616133 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:27:13 crc kubenswrapper[4817]: E0218 15:27:13.616524 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:27:26 crc kubenswrapper[4817]: I0218 15:27:26.171610 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:27:26 crc kubenswrapper[4817]: E0218 15:27:26.172394 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:27:37 crc kubenswrapper[4817]: I0218 15:27:37.172323 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:27:37 crc kubenswrapper[4817]: E0218 15:27:37.173152 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:27:51 crc kubenswrapper[4817]: I0218 15:27:51.172675 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:27:51 crc kubenswrapper[4817]: E0218 15:27:51.173623 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.173694 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6c46z/must-gather-t2lxx"] Feb 18 15:27:53 crc kubenswrapper[4817]: E0218 15:27:53.174543 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9bafa96-f855-428b-a191-23680be1422a" containerName="registry-server" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.174561 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9bafa96-f855-428b-a191-23680be1422a" containerName="registry-server" Feb 18 15:27:53 crc kubenswrapper[4817]: E0218 15:27:53.174611 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9bafa96-f855-428b-a191-23680be1422a" containerName="extract-content" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.174620 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9bafa96-f855-428b-a191-23680be1422a" containerName="extract-content" Feb 18 15:27:53 crc kubenswrapper[4817]: E0218 15:27:53.174638 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9bafa96-f855-428b-a191-23680be1422a" containerName="extract-utilities" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.174645 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9bafa96-f855-428b-a191-23680be1422a" containerName="extract-utilities" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.174881 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9bafa96-f855-428b-a191-23680be1422a" containerName="registry-server" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.176323 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/must-gather-t2lxx" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.181754 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6c46z"/"openshift-service-ca.crt" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.181992 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6c46z"/"kube-root-ca.crt" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.204858 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6c46z/must-gather-t2lxx"] Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.312749 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-must-gather-output\") pod \"must-gather-t2lxx\" (UID: \"1934f4f8-1ea2-4e98-bf8e-38ae890846e6\") " pod="openshift-must-gather-6c46z/must-gather-t2lxx" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.312856 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csrj5\" (UniqueName: \"kubernetes.io/projected/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-kube-api-access-csrj5\") pod \"must-gather-t2lxx\" (UID: \"1934f4f8-1ea2-4e98-bf8e-38ae890846e6\") " pod="openshift-must-gather-6c46z/must-gather-t2lxx" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.414843 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-must-gather-output\") pod \"must-gather-t2lxx\" (UID: \"1934f4f8-1ea2-4e98-bf8e-38ae890846e6\") " pod="openshift-must-gather-6c46z/must-gather-t2lxx" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.414918 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-csrj5\" (UniqueName: \"kubernetes.io/projected/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-kube-api-access-csrj5\") pod \"must-gather-t2lxx\" (UID: \"1934f4f8-1ea2-4e98-bf8e-38ae890846e6\") " pod="openshift-must-gather-6c46z/must-gather-t2lxx" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.415426 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-must-gather-output\") pod \"must-gather-t2lxx\" (UID: \"1934f4f8-1ea2-4e98-bf8e-38ae890846e6\") " pod="openshift-must-gather-6c46z/must-gather-t2lxx" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.653690 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csrj5\" (UniqueName: \"kubernetes.io/projected/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-kube-api-access-csrj5\") pod \"must-gather-t2lxx\" (UID: \"1934f4f8-1ea2-4e98-bf8e-38ae890846e6\") " pod="openshift-must-gather-6c46z/must-gather-t2lxx" Feb 18 15:27:53 crc kubenswrapper[4817]: I0218 15:27:53.806913 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/must-gather-t2lxx" Feb 18 15:27:54 crc kubenswrapper[4817]: I0218 15:27:54.336778 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6c46z/must-gather-t2lxx"] Feb 18 15:27:55 crc kubenswrapper[4817]: I0218 15:27:55.016367 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/must-gather-t2lxx" event={"ID":"1934f4f8-1ea2-4e98-bf8e-38ae890846e6","Type":"ContainerStarted","Data":"e09d56384f727cc8a35e6fa01eacf222a5650536714dc270e270fccedc921ad2"} Feb 18 15:27:55 crc kubenswrapper[4817]: I0218 15:27:55.016947 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/must-gather-t2lxx" event={"ID":"1934f4f8-1ea2-4e98-bf8e-38ae890846e6","Type":"ContainerStarted","Data":"0ca754a60390073d53c6e9963f75a13413d45ba21c76dcea39c27b15c109f77e"} Feb 18 15:27:55 crc kubenswrapper[4817]: I0218 15:27:55.016965 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/must-gather-t2lxx" event={"ID":"1934f4f8-1ea2-4e98-bf8e-38ae890846e6","Type":"ContainerStarted","Data":"32dd39c30817a032bcb1421ae52c04ee206cd210191f0a0d54f7002dff80ee92"} Feb 18 15:27:55 crc kubenswrapper[4817]: I0218 15:27:55.034717 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6c46z/must-gather-t2lxx" podStartSLOduration=2.03470206 podStartE2EDuration="2.03470206s" podCreationTimestamp="2026-02-18 15:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 15:27:55.02876889 +0000 UTC m=+5337.604304873" watchObservedRunningTime="2026-02-18 15:27:55.03470206 +0000 UTC m=+5337.610238033" Feb 18 15:27:56 crc kubenswrapper[4817]: E0218 15:27:56.821135 4817 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.38:58590->38.102.83.38:36749: write tcp 
38.102.83.38:58590->38.102.83.38:36749: write: broken pipe Feb 18 15:27:57 crc kubenswrapper[4817]: E0218 15:27:57.599895 4817 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.38:58694->38.102.83.38:36749: write tcp 38.102.83.38:58694->38.102.83.38:36749: write: connection reset by peer Feb 18 15:27:58 crc kubenswrapper[4817]: I0218 15:27:58.478623 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6c46z/crc-debug-tkbb4"] Feb 18 15:27:58 crc kubenswrapper[4817]: I0218 15:27:58.480799 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-tkbb4" Feb 18 15:27:58 crc kubenswrapper[4817]: I0218 15:27:58.483253 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6c46z"/"default-dockercfg-gttqm" Feb 18 15:27:58 crc kubenswrapper[4817]: I0218 15:27:58.522136 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x8z9\" (UniqueName: \"kubernetes.io/projected/321ad991-dbfc-41ac-82f0-67e963a15028-kube-api-access-7x8z9\") pod \"crc-debug-tkbb4\" (UID: \"321ad991-dbfc-41ac-82f0-67e963a15028\") " pod="openshift-must-gather-6c46z/crc-debug-tkbb4" Feb 18 15:27:58 crc kubenswrapper[4817]: I0218 15:27:58.522198 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/321ad991-dbfc-41ac-82f0-67e963a15028-host\") pod \"crc-debug-tkbb4\" (UID: \"321ad991-dbfc-41ac-82f0-67e963a15028\") " pod="openshift-must-gather-6c46z/crc-debug-tkbb4" Feb 18 15:27:58 crc kubenswrapper[4817]: I0218 15:27:58.624481 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/321ad991-dbfc-41ac-82f0-67e963a15028-host\") pod \"crc-debug-tkbb4\" (UID: \"321ad991-dbfc-41ac-82f0-67e963a15028\") " 
pod="openshift-must-gather-6c46z/crc-debug-tkbb4" Feb 18 15:27:58 crc kubenswrapper[4817]: I0218 15:27:58.624667 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/321ad991-dbfc-41ac-82f0-67e963a15028-host\") pod \"crc-debug-tkbb4\" (UID: \"321ad991-dbfc-41ac-82f0-67e963a15028\") " pod="openshift-must-gather-6c46z/crc-debug-tkbb4" Feb 18 15:27:58 crc kubenswrapper[4817]: I0218 15:27:58.625039 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x8z9\" (UniqueName: \"kubernetes.io/projected/321ad991-dbfc-41ac-82f0-67e963a15028-kube-api-access-7x8z9\") pod \"crc-debug-tkbb4\" (UID: \"321ad991-dbfc-41ac-82f0-67e963a15028\") " pod="openshift-must-gather-6c46z/crc-debug-tkbb4" Feb 18 15:27:58 crc kubenswrapper[4817]: I0218 15:27:58.646283 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x8z9\" (UniqueName: \"kubernetes.io/projected/321ad991-dbfc-41ac-82f0-67e963a15028-kube-api-access-7x8z9\") pod \"crc-debug-tkbb4\" (UID: \"321ad991-dbfc-41ac-82f0-67e963a15028\") " pod="openshift-must-gather-6c46z/crc-debug-tkbb4" Feb 18 15:27:58 crc kubenswrapper[4817]: I0218 15:27:58.798524 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-tkbb4" Feb 18 15:27:58 crc kubenswrapper[4817]: W0218 15:27:58.825573 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod321ad991_dbfc_41ac_82f0_67e963a15028.slice/crio-c91acc4a2ca5705a0fe4693126b16026a11ec14e81e9914129daa3e08791dcfb WatchSource:0}: Error finding container c91acc4a2ca5705a0fe4693126b16026a11ec14e81e9914129daa3e08791dcfb: Status 404 returned error can't find the container with id c91acc4a2ca5705a0fe4693126b16026a11ec14e81e9914129daa3e08791dcfb Feb 18 15:27:59 crc kubenswrapper[4817]: I0218 15:27:59.051021 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/crc-debug-tkbb4" event={"ID":"321ad991-dbfc-41ac-82f0-67e963a15028","Type":"ContainerStarted","Data":"c91acc4a2ca5705a0fe4693126b16026a11ec14e81e9914129daa3e08791dcfb"} Feb 18 15:28:00 crc kubenswrapper[4817]: I0218 15:28:00.062376 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/crc-debug-tkbb4" event={"ID":"321ad991-dbfc-41ac-82f0-67e963a15028","Type":"ContainerStarted","Data":"a0805254dc44a0033e23089b678026fddb479255f76f6fbb3674b29e171925ad"} Feb 18 15:28:00 crc kubenswrapper[4817]: I0218 15:28:00.086512 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6c46z/crc-debug-tkbb4" podStartSLOduration=2.086492422 podStartE2EDuration="2.086492422s" podCreationTimestamp="2026-02-18 15:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 15:28:00.076651803 +0000 UTC m=+5342.652187786" watchObservedRunningTime="2026-02-18 15:28:00.086492422 +0000 UTC m=+5342.662028405" Feb 18 15:28:04 crc kubenswrapper[4817]: I0218 15:28:04.172223 4817 scope.go:117] "RemoveContainer" 
containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:28:04 crc kubenswrapper[4817]: E0218 15:28:04.173248 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:28:16 crc kubenswrapper[4817]: I0218 15:28:16.172105 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:28:16 crc kubenswrapper[4817]: E0218 15:28:16.173026 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:28:28 crc kubenswrapper[4817]: I0218 15:28:28.180608 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:28:28 crc kubenswrapper[4817]: E0218 15:28:28.181315 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:28:39 crc kubenswrapper[4817]: I0218 15:28:39.172204 4817 scope.go:117] 
"RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:28:39 crc kubenswrapper[4817]: E0218 15:28:39.174551 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:28:52 crc kubenswrapper[4817]: I0218 15:28:52.177691 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:28:52 crc kubenswrapper[4817]: E0218 15:28:52.178677 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:28:54 crc kubenswrapper[4817]: I0218 15:28:54.652341 4817 generic.go:334] "Generic (PLEG): container finished" podID="321ad991-dbfc-41ac-82f0-67e963a15028" containerID="a0805254dc44a0033e23089b678026fddb479255f76f6fbb3674b29e171925ad" exitCode=0 Feb 18 15:28:54 crc kubenswrapper[4817]: I0218 15:28:54.652422 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/crc-debug-tkbb4" event={"ID":"321ad991-dbfc-41ac-82f0-67e963a15028","Type":"ContainerDied","Data":"a0805254dc44a0033e23089b678026fddb479255f76f6fbb3674b29e171925ad"} Feb 18 15:28:55 crc kubenswrapper[4817]: I0218 15:28:55.798217 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-tkbb4" Feb 18 15:28:55 crc kubenswrapper[4817]: I0218 15:28:55.840475 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6c46z/crc-debug-tkbb4"] Feb 18 15:28:55 crc kubenswrapper[4817]: I0218 15:28:55.853088 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6c46z/crc-debug-tkbb4"] Feb 18 15:28:55 crc kubenswrapper[4817]: I0218 15:28:55.883868 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x8z9\" (UniqueName: \"kubernetes.io/projected/321ad991-dbfc-41ac-82f0-67e963a15028-kube-api-access-7x8z9\") pod \"321ad991-dbfc-41ac-82f0-67e963a15028\" (UID: \"321ad991-dbfc-41ac-82f0-67e963a15028\") " Feb 18 15:28:55 crc kubenswrapper[4817]: I0218 15:28:55.884074 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/321ad991-dbfc-41ac-82f0-67e963a15028-host\") pod \"321ad991-dbfc-41ac-82f0-67e963a15028\" (UID: \"321ad991-dbfc-41ac-82f0-67e963a15028\") " Feb 18 15:28:55 crc kubenswrapper[4817]: I0218 15:28:55.884214 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/321ad991-dbfc-41ac-82f0-67e963a15028-host" (OuterVolumeSpecName: "host") pod "321ad991-dbfc-41ac-82f0-67e963a15028" (UID: "321ad991-dbfc-41ac-82f0-67e963a15028"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 15:28:55 crc kubenswrapper[4817]: I0218 15:28:55.884615 4817 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/321ad991-dbfc-41ac-82f0-67e963a15028-host\") on node \"crc\" DevicePath \"\"" Feb 18 15:28:55 crc kubenswrapper[4817]: I0218 15:28:55.890513 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/321ad991-dbfc-41ac-82f0-67e963a15028-kube-api-access-7x8z9" (OuterVolumeSpecName: "kube-api-access-7x8z9") pod "321ad991-dbfc-41ac-82f0-67e963a15028" (UID: "321ad991-dbfc-41ac-82f0-67e963a15028"). InnerVolumeSpecName "kube-api-access-7x8z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:28:55 crc kubenswrapper[4817]: I0218 15:28:55.991669 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x8z9\" (UniqueName: \"kubernetes.io/projected/321ad991-dbfc-41ac-82f0-67e963a15028-kube-api-access-7x8z9\") on node \"crc\" DevicePath \"\"" Feb 18 15:28:56 crc kubenswrapper[4817]: I0218 15:28:56.183279 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="321ad991-dbfc-41ac-82f0-67e963a15028" path="/var/lib/kubelet/pods/321ad991-dbfc-41ac-82f0-67e963a15028/volumes" Feb 18 15:28:56 crc kubenswrapper[4817]: I0218 15:28:56.673118 4817 scope.go:117] "RemoveContainer" containerID="a0805254dc44a0033e23089b678026fddb479255f76f6fbb3674b29e171925ad" Feb 18 15:28:56 crc kubenswrapper[4817]: I0218 15:28:56.673201 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-tkbb4" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.117789 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6c46z/crc-debug-k857d"] Feb 18 15:28:57 crc kubenswrapper[4817]: E0218 15:28:57.118237 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="321ad991-dbfc-41ac-82f0-67e963a15028" containerName="container-00" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.118249 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="321ad991-dbfc-41ac-82f0-67e963a15028" containerName="container-00" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.118461 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="321ad991-dbfc-41ac-82f0-67e963a15028" containerName="container-00" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.119162 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-k857d" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.121705 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6c46z"/"default-dockercfg-gttqm" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.218688 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n78d\" (UniqueName: \"kubernetes.io/projected/58607420-0840-4942-bf6b-e168b061e00a-kube-api-access-6n78d\") pod \"crc-debug-k857d\" (UID: \"58607420-0840-4942-bf6b-e168b061e00a\") " pod="openshift-must-gather-6c46z/crc-debug-k857d" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.218786 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58607420-0840-4942-bf6b-e168b061e00a-host\") pod \"crc-debug-k857d\" (UID: \"58607420-0840-4942-bf6b-e168b061e00a\") " 
pod="openshift-must-gather-6c46z/crc-debug-k857d" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.320783 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n78d\" (UniqueName: \"kubernetes.io/projected/58607420-0840-4942-bf6b-e168b061e00a-kube-api-access-6n78d\") pod \"crc-debug-k857d\" (UID: \"58607420-0840-4942-bf6b-e168b061e00a\") " pod="openshift-must-gather-6c46z/crc-debug-k857d" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.321177 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58607420-0840-4942-bf6b-e168b061e00a-host\") pod \"crc-debug-k857d\" (UID: \"58607420-0840-4942-bf6b-e168b061e00a\") " pod="openshift-must-gather-6c46z/crc-debug-k857d" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.321286 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58607420-0840-4942-bf6b-e168b061e00a-host\") pod \"crc-debug-k857d\" (UID: \"58607420-0840-4942-bf6b-e168b061e00a\") " pod="openshift-must-gather-6c46z/crc-debug-k857d" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.345865 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n78d\" (UniqueName: \"kubernetes.io/projected/58607420-0840-4942-bf6b-e168b061e00a-kube-api-access-6n78d\") pod \"crc-debug-k857d\" (UID: \"58607420-0840-4942-bf6b-e168b061e00a\") " pod="openshift-must-gather-6c46z/crc-debug-k857d" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.436639 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-k857d" Feb 18 15:28:57 crc kubenswrapper[4817]: I0218 15:28:57.692804 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/crc-debug-k857d" event={"ID":"58607420-0840-4942-bf6b-e168b061e00a","Type":"ContainerStarted","Data":"98bf9bf296b4a0d747ce6a578e2a0391dd7b18399f32218aa39acb4367d7e467"} Feb 18 15:28:58 crc kubenswrapper[4817]: I0218 15:28:58.704474 4817 generic.go:334] "Generic (PLEG): container finished" podID="58607420-0840-4942-bf6b-e168b061e00a" containerID="4c2f4e5f4370220ea6a491124de236510daa3a12d6915cd706d301a3bade5e74" exitCode=0 Feb 18 15:28:58 crc kubenswrapper[4817]: I0218 15:28:58.704573 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/crc-debug-k857d" event={"ID":"58607420-0840-4942-bf6b-e168b061e00a","Type":"ContainerDied","Data":"4c2f4e5f4370220ea6a491124de236510daa3a12d6915cd706d301a3bade5e74"} Feb 18 15:28:59 crc kubenswrapper[4817]: I0218 15:28:59.842179 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-k857d" Feb 18 15:28:59 crc kubenswrapper[4817]: I0218 15:28:59.972042 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n78d\" (UniqueName: \"kubernetes.io/projected/58607420-0840-4942-bf6b-e168b061e00a-kube-api-access-6n78d\") pod \"58607420-0840-4942-bf6b-e168b061e00a\" (UID: \"58607420-0840-4942-bf6b-e168b061e00a\") " Feb 18 15:28:59 crc kubenswrapper[4817]: I0218 15:28:59.972182 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58607420-0840-4942-bf6b-e168b061e00a-host\") pod \"58607420-0840-4942-bf6b-e168b061e00a\" (UID: \"58607420-0840-4942-bf6b-e168b061e00a\") " Feb 18 15:28:59 crc kubenswrapper[4817]: I0218 15:28:59.972704 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58607420-0840-4942-bf6b-e168b061e00a-host" (OuterVolumeSpecName: "host") pod "58607420-0840-4942-bf6b-e168b061e00a" (UID: "58607420-0840-4942-bf6b-e168b061e00a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 15:28:59 crc kubenswrapper[4817]: I0218 15:28:59.983219 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58607420-0840-4942-bf6b-e168b061e00a-kube-api-access-6n78d" (OuterVolumeSpecName: "kube-api-access-6n78d") pod "58607420-0840-4942-bf6b-e168b061e00a" (UID: "58607420-0840-4942-bf6b-e168b061e00a"). InnerVolumeSpecName "kube-api-access-6n78d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:29:00 crc kubenswrapper[4817]: I0218 15:29:00.074195 4817 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58607420-0840-4942-bf6b-e168b061e00a-host\") on node \"crc\" DevicePath \"\"" Feb 18 15:29:00 crc kubenswrapper[4817]: I0218 15:29:00.074228 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n78d\" (UniqueName: \"kubernetes.io/projected/58607420-0840-4942-bf6b-e168b061e00a-kube-api-access-6n78d\") on node \"crc\" DevicePath \"\"" Feb 18 15:29:00 crc kubenswrapper[4817]: I0218 15:29:00.724532 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/crc-debug-k857d" event={"ID":"58607420-0840-4942-bf6b-e168b061e00a","Type":"ContainerDied","Data":"98bf9bf296b4a0d747ce6a578e2a0391dd7b18399f32218aa39acb4367d7e467"} Feb 18 15:29:00 crc kubenswrapper[4817]: I0218 15:29:00.724560 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-k857d" Feb 18 15:29:00 crc kubenswrapper[4817]: I0218 15:29:00.724571 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98bf9bf296b4a0d747ce6a578e2a0391dd7b18399f32218aa39acb4367d7e467" Feb 18 15:29:00 crc kubenswrapper[4817]: I0218 15:29:00.932692 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6c46z/crc-debug-k857d"] Feb 18 15:29:00 crc kubenswrapper[4817]: I0218 15:29:00.954755 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6c46z/crc-debug-k857d"] Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.184079 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58607420-0840-4942-bf6b-e168b061e00a" path="/var/lib/kubelet/pods/58607420-0840-4942-bf6b-e168b061e00a/volumes" Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.656219 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6c46z/crc-debug-vxwrh"] Feb 18 15:29:02 crc kubenswrapper[4817]: E0218 15:29:02.656943 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58607420-0840-4942-bf6b-e168b061e00a" containerName="container-00" Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.656959 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="58607420-0840-4942-bf6b-e168b061e00a" containerName="container-00" Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.657185 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="58607420-0840-4942-bf6b-e168b061e00a" containerName="container-00" Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.657914 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-vxwrh" Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.660601 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6c46z"/"default-dockercfg-gttqm" Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.826506 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a79d9df-dc40-4a60-8506-8367f9165772-host\") pod \"crc-debug-vxwrh\" (UID: \"0a79d9df-dc40-4a60-8506-8367f9165772\") " pod="openshift-must-gather-6c46z/crc-debug-vxwrh" Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.826724 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8xl4\" (UniqueName: \"kubernetes.io/projected/0a79d9df-dc40-4a60-8506-8367f9165772-kube-api-access-t8xl4\") pod \"crc-debug-vxwrh\" (UID: \"0a79d9df-dc40-4a60-8506-8367f9165772\") " pod="openshift-must-gather-6c46z/crc-debug-vxwrh" Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.928749 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a79d9df-dc40-4a60-8506-8367f9165772-host\") pod \"crc-debug-vxwrh\" (UID: \"0a79d9df-dc40-4a60-8506-8367f9165772\") " pod="openshift-must-gather-6c46z/crc-debug-vxwrh" Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.928904 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8xl4\" (UniqueName: \"kubernetes.io/projected/0a79d9df-dc40-4a60-8506-8367f9165772-kube-api-access-t8xl4\") pod \"crc-debug-vxwrh\" (UID: \"0a79d9df-dc40-4a60-8506-8367f9165772\") " pod="openshift-must-gather-6c46z/crc-debug-vxwrh" Feb 18 15:29:02 crc kubenswrapper[4817]: I0218 15:29:02.928937 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/0a79d9df-dc40-4a60-8506-8367f9165772-host\") pod \"crc-debug-vxwrh\" (UID: \"0a79d9df-dc40-4a60-8506-8367f9165772\") " pod="openshift-must-gather-6c46z/crc-debug-vxwrh" Feb 18 15:29:03 crc kubenswrapper[4817]: I0218 15:29:03.448911 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8xl4\" (UniqueName: \"kubernetes.io/projected/0a79d9df-dc40-4a60-8506-8367f9165772-kube-api-access-t8xl4\") pod \"crc-debug-vxwrh\" (UID: \"0a79d9df-dc40-4a60-8506-8367f9165772\") " pod="openshift-must-gather-6c46z/crc-debug-vxwrh" Feb 18 15:29:03 crc kubenswrapper[4817]: I0218 15:29:03.575972 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-vxwrh" Feb 18 15:29:03 crc kubenswrapper[4817]: I0218 15:29:03.764174 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/crc-debug-vxwrh" event={"ID":"0a79d9df-dc40-4a60-8506-8367f9165772","Type":"ContainerStarted","Data":"d632020d0aa4e2943a982322b81bef4462a5e238c88ed431870e215d728d60f3"} Feb 18 15:29:04 crc kubenswrapper[4817]: I0218 15:29:04.171751 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:29:04 crc kubenswrapper[4817]: E0218 15:29:04.172338 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:29:04 crc kubenswrapper[4817]: I0218 15:29:04.777698 4817 generic.go:334] "Generic (PLEG): container finished" podID="0a79d9df-dc40-4a60-8506-8367f9165772" 
containerID="0aace716b956e13ef2f74435e474965f0666a13d93b336abde7d0bc9b67abb1d" exitCode=0 Feb 18 15:29:04 crc kubenswrapper[4817]: I0218 15:29:04.777740 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/crc-debug-vxwrh" event={"ID":"0a79d9df-dc40-4a60-8506-8367f9165772","Type":"ContainerDied","Data":"0aace716b956e13ef2f74435e474965f0666a13d93b336abde7d0bc9b67abb1d"} Feb 18 15:29:04 crc kubenswrapper[4817]: I0218 15:29:04.815177 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6c46z/crc-debug-vxwrh"] Feb 18 15:29:04 crc kubenswrapper[4817]: I0218 15:29:04.825135 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6c46z/crc-debug-vxwrh"] Feb 18 15:29:05 crc kubenswrapper[4817]: I0218 15:29:05.918022 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-vxwrh" Feb 18 15:29:05 crc kubenswrapper[4817]: I0218 15:29:05.990137 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8xl4\" (UniqueName: \"kubernetes.io/projected/0a79d9df-dc40-4a60-8506-8367f9165772-kube-api-access-t8xl4\") pod \"0a79d9df-dc40-4a60-8506-8367f9165772\" (UID: \"0a79d9df-dc40-4a60-8506-8367f9165772\") " Feb 18 15:29:05 crc kubenswrapper[4817]: I0218 15:29:05.990443 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a79d9df-dc40-4a60-8506-8367f9165772-host\") pod \"0a79d9df-dc40-4a60-8506-8367f9165772\" (UID: \"0a79d9df-dc40-4a60-8506-8367f9165772\") " Feb 18 15:29:05 crc kubenswrapper[4817]: I0218 15:29:05.990622 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a79d9df-dc40-4a60-8506-8367f9165772-host" (OuterVolumeSpecName: "host") pod "0a79d9df-dc40-4a60-8506-8367f9165772" (UID: "0a79d9df-dc40-4a60-8506-8367f9165772"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 18 15:29:05 crc kubenswrapper[4817]: I0218 15:29:05.991474 4817 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a79d9df-dc40-4a60-8506-8367f9165772-host\") on node \"crc\" DevicePath \"\"" Feb 18 15:29:06 crc kubenswrapper[4817]: I0218 15:29:06.000150 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a79d9df-dc40-4a60-8506-8367f9165772-kube-api-access-t8xl4" (OuterVolumeSpecName: "kube-api-access-t8xl4") pod "0a79d9df-dc40-4a60-8506-8367f9165772" (UID: "0a79d9df-dc40-4a60-8506-8367f9165772"). InnerVolumeSpecName "kube-api-access-t8xl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:29:06 crc kubenswrapper[4817]: I0218 15:29:06.094192 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8xl4\" (UniqueName: \"kubernetes.io/projected/0a79d9df-dc40-4a60-8506-8367f9165772-kube-api-access-t8xl4\") on node \"crc\" DevicePath \"\"" Feb 18 15:29:06 crc kubenswrapper[4817]: I0218 15:29:06.184413 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a79d9df-dc40-4a60-8506-8367f9165772" path="/var/lib/kubelet/pods/0a79d9df-dc40-4a60-8506-8367f9165772/volumes" Feb 18 15:29:06 crc kubenswrapper[4817]: I0218 15:29:06.798647 4817 scope.go:117] "RemoveContainer" containerID="0aace716b956e13ef2f74435e474965f0666a13d93b336abde7d0bc9b67abb1d" Feb 18 15:29:06 crc kubenswrapper[4817]: I0218 15:29:06.798673 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/crc-debug-vxwrh" Feb 18 15:29:17 crc kubenswrapper[4817]: I0218 15:29:17.173420 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:29:17 crc kubenswrapper[4817]: E0218 15:29:17.174308 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:29:31 crc kubenswrapper[4817]: I0218 15:29:31.171599 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:29:31 crc kubenswrapper[4817]: E0218 15:29:31.172246 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:29:42 crc kubenswrapper[4817]: I0218 15:29:42.171832 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:29:42 crc kubenswrapper[4817]: E0218 15:29:42.172493 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:29:42 crc kubenswrapper[4817]: I0218 15:29:42.788045 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9e5146f3-4a88-4e31-82e7-0e0f72188d22/init-config-reloader/0.log" Feb 18 15:29:43 crc kubenswrapper[4817]: I0218 15:29:43.414729 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9e5146f3-4a88-4e31-82e7-0e0f72188d22/config-reloader/0.log" Feb 18 15:29:43 crc kubenswrapper[4817]: I0218 15:29:43.431211 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9e5146f3-4a88-4e31-82e7-0e0f72188d22/init-config-reloader/0.log" Feb 18 15:29:43 crc kubenswrapper[4817]: I0218 15:29:43.446822 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9e5146f3-4a88-4e31-82e7-0e0f72188d22/alertmanager/0.log" Feb 18 15:29:43 crc kubenswrapper[4817]: I0218 15:29:43.611357 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-759f74666b-ms4jl_f6b68ae5-35a8-4050-9c64-e6ef834803fd/barbican-api/0.log" Feb 18 15:29:43 crc kubenswrapper[4817]: I0218 15:29:43.645498 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-759f74666b-ms4jl_f6b68ae5-35a8-4050-9c64-e6ef834803fd/barbican-api-log/0.log" Feb 18 15:29:43 crc kubenswrapper[4817]: I0218 15:29:43.687055 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bf77b97db-hknps_33929d5f-e679-44e2-a0e9-816088e17cb1/barbican-keystone-listener/0.log" Feb 18 15:29:43 crc kubenswrapper[4817]: I0218 15:29:43.925005 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5bf77b97db-hknps_33929d5f-e679-44e2-a0e9-816088e17cb1/barbican-keystone-listener-log/0.log" Feb 18 
15:29:43 crc kubenswrapper[4817]: I0218 15:29:43.991249 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-78b498b86c-nn4vc_864e8c7f-e3b7-4f27-960e-df753b339571/barbican-worker-log/0.log" Feb 18 15:29:43 crc kubenswrapper[4817]: I0218 15:29:43.992462 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-78b498b86c-nn4vc_864e8c7f-e3b7-4f27-960e-df753b339571/barbican-worker/0.log" Feb 18 15:29:44 crc kubenswrapper[4817]: I0218 15:29:44.242817 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-5df4f_d825b3d4-dc9e-46b4-930e-21c1fa0c5ee9/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:44 crc kubenswrapper[4817]: I0218 15:29:44.281065 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8/ceilometer-central-agent/0.log" Feb 18 15:29:44 crc kubenswrapper[4817]: I0218 15:29:44.432283 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8/proxy-httpd/0.log" Feb 18 15:29:44 crc kubenswrapper[4817]: I0218 15:29:44.453699 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8/ceilometer-notification-agent/0.log" Feb 18 15:29:44 crc kubenswrapper[4817]: I0218 15:29:44.455733 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6c6555c-9c58-4d2b-8e3b-b54331a1f9f8/sg-core/0.log" Feb 18 15:29:44 crc kubenswrapper[4817]: I0218 15:29:44.660669 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a414e293-71b9-44c3-8f07-20f3696f7db6/cinder-api-log/0.log" Feb 18 15:29:44 crc kubenswrapper[4817]: I0218 15:29:44.713587 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_a414e293-71b9-44c3-8f07-20f3696f7db6/cinder-api/0.log" Feb 18 15:29:44 crc kubenswrapper[4817]: I0218 15:29:44.829129 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_89455d4a-c424-4e7a-85c5-42163318e132/cinder-scheduler/0.log" Feb 18 15:29:45 crc kubenswrapper[4817]: I0218 15:29:45.529440 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_09f7d0fc-a70f-4296-82f1-1cdd302a4a60/cloudkitty-api-log/0.log" Feb 18 15:29:45 crc kubenswrapper[4817]: I0218 15:29:45.541942 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_89455d4a-c424-4e7a-85c5-42163318e132/probe/0.log" Feb 18 15:29:45 crc kubenswrapper[4817]: I0218 15:29:45.703288 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_09f7d0fc-a70f-4296-82f1-1cdd302a4a60/cloudkitty-api/0.log" Feb 18 15:29:45 crc kubenswrapper[4817]: I0218 15:29:45.755023 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_75cbd0d0-2a48-48ba-9cae-d465da658b05/loki-compactor/0.log" Feb 18 15:29:45 crc kubenswrapper[4817]: I0218 15:29:45.906181 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-fmj4p_f8a07d20-5f6a-4cc8-9e98-6e3002a04f7b/loki-distributor/0.log" Feb 18 15:29:46 crc kubenswrapper[4817]: I0218 15:29:46.020751 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-j7gxb_e596742a-2a5e-4a0c-9177-2b5a1ce00651/gateway/0.log" Feb 18 15:29:46 crc kubenswrapper[4817]: I0218 15:29:46.096262 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-n8c26_864c0a91-5aa3-4a84-8b75-6f75e0883aea/gateway/0.log" Feb 18 15:29:46 crc kubenswrapper[4817]: I0218 15:29:46.732677 4817 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_6d8b01c7-a1be-49d1-8417-ce412fa834a4/loki-index-gateway/0.log" Feb 18 15:29:46 crc kubenswrapper[4817]: I0218 15:29:46.733801 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_7f685dd5-8921-4e4a-a4d5-d19a499775f5/loki-ingester/0.log" Feb 18 15:29:46 crc kubenswrapper[4817]: I0218 15:29:46.842082 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-mbswz_00036a73-dd30-4b48-a135-19b064818e5c/loki-query-frontend/0.log" Feb 18 15:29:47 crc kubenswrapper[4817]: I0218 15:29:47.124649 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vftp7_f04dc5ce-0657-4e8c-8c0a-3b86924ea903/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:47 crc kubenswrapper[4817]: I0218 15:29:47.437853 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-79xdx_e0745d01-0937-448d-a458-6f5823075a7a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:47 crc kubenswrapper[4817]: I0218 15:29:47.654237 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bb494c7f-kmtc2_d48ebb6a-086e-4e2e-b196-5f30c0a82b14/init/0.log" Feb 18 15:29:47 crc kubenswrapper[4817]: I0218 15:29:47.862253 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bb494c7f-kmtc2_d48ebb6a-086e-4e2e-b196-5f30c0a82b14/init/0.log" Feb 18 15:29:47 crc kubenswrapper[4817]: I0218 15:29:47.969290 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7bb494c7f-kmtc2_d48ebb6a-086e-4e2e-b196-5f30c0a82b14/dnsmasq-dns/0.log" Feb 18 15:29:48 crc kubenswrapper[4817]: I0218 15:29:48.090133 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5qxkc_5b90b086-b1c9-4a0a-9f4e-ebf1c8beb807/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:48 crc kubenswrapper[4817]: I0218 15:29:48.108318 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-qb4ph_b7632ade-ab1b-45b8-9f25-9fb98abc4f1a/loki-querier/0.log" Feb 18 15:29:48 crc kubenswrapper[4817]: I0218 15:29:48.260553 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc799768-b6dd-4b19-aee6-909d985e2441/glance-httpd/0.log" Feb 18 15:29:48 crc kubenswrapper[4817]: I0218 15:29:48.307578 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc799768-b6dd-4b19-aee6-909d985e2441/glance-log/0.log" Feb 18 15:29:48 crc kubenswrapper[4817]: I0218 15:29:48.487900 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b542b984-8146-47e2-b20a-1b344762c302/glance-httpd/0.log" Feb 18 15:29:48 crc kubenswrapper[4817]: I0218 15:29:48.496580 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b542b984-8146-47e2-b20a-1b344762c302/glance-log/0.log" Feb 18 15:29:48 crc kubenswrapper[4817]: I0218 15:29:48.681942 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mn4cg_f54e3715-121a-4498-a552-5a9f1daed55c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:48 crc kubenswrapper[4817]: I0218 15:29:48.889006 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kttlh_2e45ac1d-02e2-457d-9944-cf1ecaf8edd3/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:49 crc kubenswrapper[4817]: I0218 15:29:49.129070 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29523781-vf8xd_06ad1a18-8a33-4ac1-a6df-9cb4b251c549/keystone-cron/0.log" Feb 18 15:29:49 crc kubenswrapper[4817]: I0218 15:29:49.321846 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-755bd56c8d-4mwpl_47ace64c-6cdb-4868-8655-7e149f33a069/keystone-api/0.log" Feb 18 15:29:49 crc kubenswrapper[4817]: I0218 15:29:49.401454 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_493be418-0841-4197-9fd7-50f22ecc6a5a/kube-state-metrics/0.log" Feb 18 15:29:49 crc kubenswrapper[4817]: I0218 15:29:49.523322 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-9dvn2_43d12b3f-f980-4075-8684-a97141a5474d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:50 crc kubenswrapper[4817]: I0218 15:29:50.037377 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7c8c8d4f9c-f58g5_206709c6-0550-4932-8f0e-f9d4c342a26c/neutron-api/0.log" Feb 18 15:29:50 crc kubenswrapper[4817]: I0218 15:29:50.083694 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7c8c8d4f9c-f58g5_206709c6-0550-4932-8f0e-f9d4c342a26c/neutron-httpd/0.log" Feb 18 15:29:50 crc kubenswrapper[4817]: I0218 15:29:50.306344 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gn2g2_43e2549e-9d03-495a-852e-0d0c283c5d51/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:50 crc kubenswrapper[4817]: I0218 15:29:50.851250 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bea1dd6e-5f07-4dd6-a191-f07f59d36043/nova-api-log/0.log" Feb 18 15:29:51 crc kubenswrapper[4817]: I0218 15:29:51.138067 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_4b99418c-8ded-4927-afdf-a9a6edbabf84/nova-cell0-conductor-conductor/0.log" Feb 18 15:29:51 crc kubenswrapper[4817]: I0218 15:29:51.427340 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bea1dd6e-5f07-4dd6-a191-f07f59d36043/nova-api-api/0.log" Feb 18 15:29:51 crc kubenswrapper[4817]: I0218 15:29:51.513684 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7bab4ac8-afc6-4ac1-938c-2d04b5dc7822/nova-cell1-conductor-conductor/0.log" Feb 18 15:29:51 crc kubenswrapper[4817]: I0218 15:29:51.830840 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jd5f6_095f77dc-6f9e-4845-9cfe-6aeac65d3ab0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:51 crc kubenswrapper[4817]: I0218 15:29:51.833015 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fc896752-3d52-40cd-8d7f-2b10ba1afab5/nova-cell1-novncproxy-novncproxy/0.log" Feb 18 15:29:52 crc kubenswrapper[4817]: I0218 15:29:52.167554 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bd52eab4-329f-4cab-83cc-c082d2d3f1d4/nova-metadata-log/0.log" Feb 18 15:29:53 crc kubenswrapper[4817]: I0218 15:29:53.035965 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_641785a9-2372-4857-8882-192bf7d7fe45/mysql-bootstrap/0.log" Feb 18 15:29:53 crc kubenswrapper[4817]: I0218 15:29:53.039511 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d5bac496-cea3-4c61-91c1-0c0ebc884737/nova-scheduler-scheduler/0.log" Feb 18 15:29:53 crc kubenswrapper[4817]: I0218 15:29:53.353524 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_641785a9-2372-4857-8882-192bf7d7fe45/mysql-bootstrap/0.log" Feb 18 15:29:53 crc kubenswrapper[4817]: 
I0218 15:29:53.370381 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_641785a9-2372-4857-8882-192bf7d7fe45/galera/0.log" Feb 18 15:29:53 crc kubenswrapper[4817]: I0218 15:29:53.586746 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_def7b080-de6e-49f1-9437-44d6f40b48c4/mysql-bootstrap/0.log" Feb 18 15:29:53 crc kubenswrapper[4817]: I0218 15:29:53.876653 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_def7b080-de6e-49f1-9437-44d6f40b48c4/galera/0.log" Feb 18 15:29:53 crc kubenswrapper[4817]: I0218 15:29:53.893230 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_def7b080-de6e-49f1-9437-44d6f40b48c4/mysql-bootstrap/0.log" Feb 18 15:29:54 crc kubenswrapper[4817]: I0218 15:29:54.073754 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ace81bfb-db15-429f-9168-936817dad694/openstackclient/0.log" Feb 18 15:29:54 crc kubenswrapper[4817]: I0218 15:29:54.737183 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_bd52eab4-329f-4cab-83cc-c082d2d3f1d4/nova-metadata-metadata/0.log" Feb 18 15:29:54 crc kubenswrapper[4817]: I0218 15:29:54.995641 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-gk7p2_be226950-1270-454d-8b23-2260dba4c819/openstack-network-exporter/0.log" Feb 18 15:29:55 crc kubenswrapper[4817]: I0218 15:29:55.010586 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9bcxg_ddb73215-bd2a-47eb-bbcf-b4708117244f/ovn-controller/0.log" Feb 18 15:29:55 crc kubenswrapper[4817]: I0218 15:29:55.259084 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-46wx9_2a166377-16ac-4c6b-9207-cddf8c814dc1/ovsdb-server-init/0.log" Feb 18 15:29:55 crc kubenswrapper[4817]: I0218 15:29:55.440660 4817 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-46wx9_2a166377-16ac-4c6b-9207-cddf8c814dc1/ovsdb-server-init/0.log" Feb 18 15:29:55 crc kubenswrapper[4817]: I0218 15:29:55.496899 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-46wx9_2a166377-16ac-4c6b-9207-cddf8c814dc1/ovs-vswitchd/0.log" Feb 18 15:29:55 crc kubenswrapper[4817]: I0218 15:29:55.520930 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-46wx9_2a166377-16ac-4c6b-9207-cddf8c814dc1/ovsdb-server/0.log" Feb 18 15:29:55 crc kubenswrapper[4817]: I0218 15:29:55.804043 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-6twbf_ae44a6c3-1d36-4f95-b52a-a1bedc6ec272/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:55 crc kubenswrapper[4817]: I0218 15:29:55.978471 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8b26cbdf-a148-4d07-bffc-afa241bc30e2/openstack-network-exporter/0.log" Feb 18 15:29:56 crc kubenswrapper[4817]: I0218 15:29:56.026255 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8b26cbdf-a148-4d07-bffc-afa241bc30e2/ovn-northd/0.log" Feb 18 15:29:56 crc kubenswrapper[4817]: I0218 15:29:56.171251 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:29:56 crc kubenswrapper[4817]: E0218 15:29:56.171607 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:29:56 crc kubenswrapper[4817]: 
I0218 15:29:56.203493 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b89d17a9-16cb-4abe-ba88-107ce95dbceb/openstack-network-exporter/0.log" Feb 18 15:29:56 crc kubenswrapper[4817]: I0218 15:29:56.257436 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b89d17a9-16cb-4abe-ba88-107ce95dbceb/ovsdbserver-nb/0.log" Feb 18 15:29:56 crc kubenswrapper[4817]: I0218 15:29:56.496049 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_317526d8-4a73-4ae4-9607-b1d7375ba7f6/ovsdbserver-sb/0.log" Feb 18 15:29:56 crc kubenswrapper[4817]: I0218 15:29:56.524322 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_317526d8-4a73-4ae4-9607-b1d7375ba7f6/openstack-network-exporter/0.log" Feb 18 15:29:56 crc kubenswrapper[4817]: I0218 15:29:56.914027 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b57694bbd-rpg5b_d284bf7a-b9d0-4fe3-b8bc-06a64d104853/placement-api/0.log" Feb 18 15:29:56 crc kubenswrapper[4817]: I0218 15:29:56.978796 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b57694bbd-rpg5b_d284bf7a-b9d0-4fe3-b8bc-06a64d104853/placement-log/0.log" Feb 18 15:29:57 crc kubenswrapper[4817]: I0218 15:29:57.126022 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fbb28d6a-260d-45fa-80ec-9f583e8fc37b/init-config-reloader/0.log" Feb 18 15:29:57 crc kubenswrapper[4817]: I0218 15:29:57.327778 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fbb28d6a-260d-45fa-80ec-9f583e8fc37b/init-config-reloader/0.log" Feb 18 15:29:57 crc kubenswrapper[4817]: I0218 15:29:57.379244 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fbb28d6a-260d-45fa-80ec-9f583e8fc37b/config-reloader/0.log" Feb 18 15:29:57 crc 
kubenswrapper[4817]: I0218 15:29:57.404616 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fbb28d6a-260d-45fa-80ec-9f583e8fc37b/prometheus/0.log" Feb 18 15:29:57 crc kubenswrapper[4817]: I0218 15:29:57.589902 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fbb28d6a-260d-45fa-80ec-9f583e8fc37b/thanos-sidecar/0.log" Feb 18 15:29:57 crc kubenswrapper[4817]: I0218 15:29:57.664614 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e19f3906-864f-49f8-b3f1-e3cfbcae4133/setup-container/0.log" Feb 18 15:29:57 crc kubenswrapper[4817]: I0218 15:29:57.899554 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e19f3906-864f-49f8-b3f1-e3cfbcae4133/setup-container/0.log" Feb 18 15:29:57 crc kubenswrapper[4817]: I0218 15:29:57.970856 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e19f3906-864f-49f8-b3f1-e3cfbcae4133/rabbitmq/0.log" Feb 18 15:29:58 crc kubenswrapper[4817]: I0218 15:29:58.182196 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f49989fd-6326-4020-aba0-45b49ed37872/setup-container/0.log" Feb 18 15:29:58 crc kubenswrapper[4817]: I0218 15:29:58.362148 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f49989fd-6326-4020-aba0-45b49ed37872/setup-container/0.log" Feb 18 15:29:58 crc kubenswrapper[4817]: I0218 15:29:58.435763 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f49989fd-6326-4020-aba0-45b49ed37872/rabbitmq/0.log" Feb 18 15:29:58 crc kubenswrapper[4817]: I0218 15:29:58.564048 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-k5wnj_3c302fa9-5186-4192-9cf6-b6d533570323/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" 
Feb 18 15:29:58 crc kubenswrapper[4817]: I0218 15:29:58.770251 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-hsvp2_d1c3dd6e-0387-41b5-b7a1-fb2ea1e053f7/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:59 crc kubenswrapper[4817]: I0218 15:29:59.052153 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-svdph_3c90beed-8bc0-4b1c-9c6c-2279e303fbb1/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:59 crc kubenswrapper[4817]: I0218 15:29:59.209341 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vxt6n_abdb2358-3c43-4027-ab8e-fb25932c4f97/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:29:59 crc kubenswrapper[4817]: I0218 15:29:59.332901 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-t7ncv_1bef22ac-a84d-4941-8290-6b98eb56367b/ssh-known-hosts-edpm-deployment/0.log" Feb 18 15:29:59 crc kubenswrapper[4817]: I0218 15:29:59.649324 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-549ff9d7ff-4pfxq_85a61008-fd45-4598-90bc-b0cf2856cefa/proxy-server/0.log" Feb 18 15:29:59 crc kubenswrapper[4817]: I0218 15:29:59.819334 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-549ff9d7ff-4pfxq_85a61008-fd45-4598-90bc-b0cf2856cefa/proxy-httpd/0.log" Feb 18 15:29:59 crc kubenswrapper[4817]: I0218 15:29:59.915471 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-c5btx_7f8fdaa1-d441-4b5e-b376-8ab67ce68339/swift-ring-rebalance/0.log" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.056536 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/account-auditor/0.log" Feb 18 15:30:00 crc 
kubenswrapper[4817]: I0218 15:30:00.155436 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw"] Feb 18 15:30:00 crc kubenswrapper[4817]: E0218 15:30:00.156011 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a79d9df-dc40-4a60-8506-8367f9165772" containerName="container-00" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.156030 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a79d9df-dc40-4a60-8506-8367f9165772" containerName="container-00" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.156315 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a79d9df-dc40-4a60-8506-8367f9165772" containerName="container-00" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.157286 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.163690 4817 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.163710 4817 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.201530 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b0090b0-ef81-41f3-830e-f99d728e796e-secret-volume\") pod \"collect-profiles-29523810-46zrw\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.201636 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/3b0090b0-ef81-41f3-830e-f99d728e796e-config-volume\") pod \"collect-profiles-29523810-46zrw\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.231429 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6ns6\" (UniqueName: \"kubernetes.io/projected/3b0090b0-ef81-41f3-830e-f99d728e796e-kube-api-access-s6ns6\") pod \"collect-profiles-29523810-46zrw\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.252056 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw"] Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.335534 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b0090b0-ef81-41f3-830e-f99d728e796e-secret-volume\") pod \"collect-profiles-29523810-46zrw\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.335963 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b0090b0-ef81-41f3-830e-f99d728e796e-config-volume\") pod \"collect-profiles-29523810-46zrw\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.336180 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6ns6\" (UniqueName: 
\"kubernetes.io/projected/3b0090b0-ef81-41f3-830e-f99d728e796e-kube-api-access-s6ns6\") pod \"collect-profiles-29523810-46zrw\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.337354 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b0090b0-ef81-41f3-830e-f99d728e796e-config-volume\") pod \"collect-profiles-29523810-46zrw\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.348962 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b0090b0-ef81-41f3-830e-f99d728e796e-secret-volume\") pod \"collect-profiles-29523810-46zrw\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.381841 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6ns6\" (UniqueName: \"kubernetes.io/projected/3b0090b0-ef81-41f3-830e-f99d728e796e-kube-api-access-s6ns6\") pod \"collect-profiles-29523810-46zrw\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.423385 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/account-reaper/0.log" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.511451 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.717374 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/account-replicator/0.log" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.737021 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/account-server/0.log" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.777059 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/container-auditor/0.log" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.940814 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/container-server/0.log" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.983555 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/container-replicator/0.log" Feb 18 15:30:00 crc kubenswrapper[4817]: I0218 15:30:00.999363 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/container-updater/0.log" Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.045251 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw"] Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.227380 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/object-auditor/0.log" Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.277681 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/object-expirer/0.log" Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.316654 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/object-replicator/0.log" Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.332655 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" event={"ID":"3b0090b0-ef81-41f3-830e-f99d728e796e","Type":"ContainerStarted","Data":"d3105cc7d0ebd142281e2507ae2ce7abeb17cd299ac0fb11942b4953f745ed53"} Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.332699 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" event={"ID":"3b0090b0-ef81-41f3-830e-f99d728e796e","Type":"ContainerStarted","Data":"b8b565cf35482552a747a22c5e167d401cbd7402adae4bbf176c928c1b52eeee"} Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.373820 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" podStartSLOduration=1.373799091 podStartE2EDuration="1.373799091s" podCreationTimestamp="2026-02-18 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-18 15:30:01.355624931 +0000 UTC m=+5463.931160914" watchObservedRunningTime="2026-02-18 15:30:01.373799091 +0000 UTC m=+5463.949335074" Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.489754 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/object-updater/0.log" Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.522054 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/object-server/0.log" Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.540466 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/rsync/0.log" Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.796652 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_77de7364-0925-438c-89e2-6ff0d3cb0776/swift-recon-cron/0.log" Feb 18 15:30:01 crc kubenswrapper[4817]: I0218 15:30:01.898942 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qxktm_5dcb9a62-3ee9-40ff-a0ba-1f7d71d630ab/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:30:02 crc kubenswrapper[4817]: I0218 15:30:02.593532 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_943aa309-38ce-4bba-8183-bfaf4357b702/test-operator-logs-container/0.log" Feb 18 15:30:02 crc kubenswrapper[4817]: I0218 15:30:02.612303 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_33627f57-553f-4c87-a517-4fbe8d221665/tempest-tests-tempest-tests-runner/0.log" Feb 18 15:30:02 crc kubenswrapper[4817]: I0218 15:30:02.945610 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-gdv4j_30959336-e13c-426f-9116-3fd2e485a6ed/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 18 15:30:03 crc kubenswrapper[4817]: I0218 15:30:03.358365 4817 generic.go:334] "Generic (PLEG): container finished" podID="3b0090b0-ef81-41f3-830e-f99d728e796e" containerID="d3105cc7d0ebd142281e2507ae2ce7abeb17cd299ac0fb11942b4953f745ed53" exitCode=0 Feb 18 15:30:03 crc kubenswrapper[4817]: I0218 15:30:03.358413 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" event={"ID":"3b0090b0-ef81-41f3-830e-f99d728e796e","Type":"ContainerDied","Data":"d3105cc7d0ebd142281e2507ae2ce7abeb17cd299ac0fb11942b4953f745ed53"} Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.394277 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" event={"ID":"3b0090b0-ef81-41f3-830e-f99d728e796e","Type":"ContainerDied","Data":"b8b565cf35482552a747a22c5e167d401cbd7402adae4bbf176c928c1b52eeee"} Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.394785 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8b565cf35482552a747a22c5e167d401cbd7402adae4bbf176c928c1b52eeee" Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.484514 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.541306 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b0090b0-ef81-41f3-830e-f99d728e796e-config-volume\") pod \"3b0090b0-ef81-41f3-830e-f99d728e796e\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.541671 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b0090b0-ef81-41f3-830e-f99d728e796e-secret-volume\") pod \"3b0090b0-ef81-41f3-830e-f99d728e796e\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.541727 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6ns6\" (UniqueName: \"kubernetes.io/projected/3b0090b0-ef81-41f3-830e-f99d728e796e-kube-api-access-s6ns6\") pod 
\"3b0090b0-ef81-41f3-830e-f99d728e796e\" (UID: \"3b0090b0-ef81-41f3-830e-f99d728e796e\") " Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.543779 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b0090b0-ef81-41f3-830e-f99d728e796e-config-volume" (OuterVolumeSpecName: "config-volume") pod "3b0090b0-ef81-41f3-830e-f99d728e796e" (UID: "3b0090b0-ef81-41f3-830e-f99d728e796e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.551688 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b0090b0-ef81-41f3-830e-f99d728e796e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3b0090b0-ef81-41f3-830e-f99d728e796e" (UID: "3b0090b0-ef81-41f3-830e-f99d728e796e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.572138 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b0090b0-ef81-41f3-830e-f99d728e796e-kube-api-access-s6ns6" (OuterVolumeSpecName: "kube-api-access-s6ns6") pod "3b0090b0-ef81-41f3-830e-f99d728e796e" (UID: "3b0090b0-ef81-41f3-830e-f99d728e796e"). InnerVolumeSpecName "kube-api-access-s6ns6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.643836 4817 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b0090b0-ef81-41f3-830e-f99d728e796e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.643863 4817 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b0090b0-ef81-41f3-830e-f99d728e796e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 18 15:30:05 crc kubenswrapper[4817]: I0218 15:30:05.643874 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6ns6\" (UniqueName: \"kubernetes.io/projected/3b0090b0-ef81-41f3-830e-f99d728e796e-kube-api-access-s6ns6\") on node \"crc\" DevicePath \"\"" Feb 18 15:30:06 crc kubenswrapper[4817]: I0218 15:30:06.403555 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29523810-46zrw" Feb 18 15:30:06 crc kubenswrapper[4817]: I0218 15:30:06.407824 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_ff7b237e-b7c9-43f1-b2fc-4b955e8cd7ad/cloudkitty-proc/0.log" Feb 18 15:30:06 crc kubenswrapper[4817]: I0218 15:30:06.579109 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq"] Feb 18 15:30:06 crc kubenswrapper[4817]: I0218 15:30:06.593654 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29523765-hjbpq"] Feb 18 15:30:08 crc kubenswrapper[4817]: I0218 15:30:08.222744 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb5930f-4bff-4d3b-aa87-1fa0c403d342" path="/var/lib/kubelet/pods/7cb5930f-4bff-4d3b-aa87-1fa0c403d342/volumes" Feb 18 15:30:10 crc kubenswrapper[4817]: I0218 
15:30:10.171766 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:30:10 crc kubenswrapper[4817]: E0218 15:30:10.172269 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:30:12 crc kubenswrapper[4817]: I0218 15:30:12.840481 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cb347a6f-041d-41e7-be8b-b151f150e6ab/memcached/0.log" Feb 18 15:30:25 crc kubenswrapper[4817]: I0218 15:30:25.171636 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:30:25 crc kubenswrapper[4817]: E0218 15:30:25.172740 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:30:34 crc kubenswrapper[4817]: I0218 15:30:34.719467 4817 scope.go:117] "RemoveContainer" containerID="de9407e60ced8c69134b232d728c93bf4d9328f9b5557fd38070be0e7b056e1f" Feb 18 15:30:35 crc kubenswrapper[4817]: I0218 15:30:35.317266 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/util/0.log" Feb 18 15:30:35 crc kubenswrapper[4817]: I0218 
15:30:35.522017 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/util/0.log" Feb 18 15:30:35 crc kubenswrapper[4817]: I0218 15:30:35.539338 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/pull/0.log" Feb 18 15:30:35 crc kubenswrapper[4817]: I0218 15:30:35.545423 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/pull/0.log" Feb 18 15:30:35 crc kubenswrapper[4817]: I0218 15:30:35.726500 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/pull/0.log" Feb 18 15:30:35 crc kubenswrapper[4817]: I0218 15:30:35.771425 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/extract/0.log" Feb 18 15:30:35 crc kubenswrapper[4817]: I0218 15:30:35.786823 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5fa593b11d514d0a7be460a834e869028c01bb0b0d1b03b6172f8c46aetjpqc_719bba7d-4fc5-4a70-9711-ecb679a5055a/util/0.log" Feb 18 15:30:36 crc kubenswrapper[4817]: I0218 15:30:36.308533 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-hwmjj_4441a78b-c58a-4030-801b-06dbfa1729b1/manager/0.log" Feb 18 15:30:36 crc kubenswrapper[4817]: I0218 15:30:36.761093 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-c2b7x_00831f79-f6d5-4896-b718-4120117751b8/manager/0.log" Feb 18 15:30:36 crc kubenswrapper[4817]: I0218 15:30:36.965367 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-zknf5_13246e06-5b63-4076-a556-de264d7afdf4/manager/0.log" Feb 18 15:30:37 crc kubenswrapper[4817]: I0218 15:30:37.172048 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:30:37 crc kubenswrapper[4817]: E0218 15:30:37.172299 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:30:37 crc kubenswrapper[4817]: I0218 15:30:37.242865 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-gq259_5721fd5d-07bb-44df-bfb8-4b4dd80ac7a4/manager/0.log" Feb 18 15:30:37 crc kubenswrapper[4817]: I0218 15:30:37.987519 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-64rvt_01962a92-98c7-412c-86a7-ee21e6cb92a9/manager/0.log" Feb 18 15:30:38 crc kubenswrapper[4817]: I0218 15:30:38.075048 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-xpwgd_e7be81da-3629-4713-87c6-34cabd9a8347/manager/0.log" Feb 18 15:30:38 crc kubenswrapper[4817]: I0218 15:30:38.461699 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-fs8m7_85bd6fc0-d973-4172-b441-c15d4abeb604/manager/0.log" Feb 18 15:30:38 crc kubenswrapper[4817]: I0218 15:30:38.759743 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-4h8qx_c5e8b4c9-5a63-44c3-9f6c-c7ee268dcef3/manager/0.log" Feb 18 15:30:39 crc kubenswrapper[4817]: I0218 15:30:39.117822 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-z64rl_4339e125-4e60-44d9-8e15-97b4000669e2/manager/0.log" Feb 18 15:30:39 crc kubenswrapper[4817]: I0218 15:30:39.371310 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-4rv4f_644f07fc-02ce-49f0-87bf-54f765c15d8c/manager/0.log" Feb 18 15:30:39 crc kubenswrapper[4817]: I0218 15:30:39.481691 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-dr67b_6e783396-37c1-4a0d-bfe4-495fdf4d41bf/manager/0.log" Feb 18 15:30:39 crc kubenswrapper[4817]: I0218 15:30:39.736795 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-9jkwb_ca917110-0727-4c63-ad9a-20722a6cba34/manager/0.log" Feb 18 15:30:39 crc kubenswrapper[4817]: I0218 15:30:39.979727 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c7hc4j_ad995216-386a-455b-b48d-378dbfd271bf/manager/0.log" Feb 18 15:30:40 crc kubenswrapper[4817]: I0218 15:30:40.221234 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5864f6ff6b-g4lc4_b05d374b-b714-4826-80b8-246c15521534/operator/0.log" Feb 18 15:30:40 crc kubenswrapper[4817]: I0218 15:30:40.472360 4817 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hfr8r_51aa0947-7a1c-4a40-bd45-299bb95ff9f1/registry-server/0.log" Feb 18 15:30:40 crc kubenswrapper[4817]: I0218 15:30:40.850603 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-l9fqt_9d933918-c23c-456a-8b3f-08ee4c2909dd/manager/0.log" Feb 18 15:30:41 crc kubenswrapper[4817]: I0218 15:30:41.175371 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-9td4r_2c04c342-bd87-46e7-8a2d-72dc30f858aa/manager/0.log" Feb 18 15:30:41 crc kubenswrapper[4817]: I0218 15:30:41.482407 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-5np9g_086958e1-8a7d-40c9-9725-f18776f863a0/operator/0.log" Feb 18 15:30:41 crc kubenswrapper[4817]: I0218 15:30:41.839510 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-ff84x_b77d6c32-6c30-42be-ab69-36b969d40950/manager/0.log" Feb 18 15:30:42 crc kubenswrapper[4817]: I0218 15:30:42.444640 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-8q5c8_bccec692-ee64-46e1-8979-e6173c132d8e/manager/0.log" Feb 18 15:30:42 crc kubenswrapper[4817]: I0218 15:30:42.675101 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7954588dd9-dngjl_ff1b9fe3-84fe-47fc-902c-aa23c9e829d8/manager/0.log" Feb 18 15:30:42 crc kubenswrapper[4817]: I0218 15:30:42.946346 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6956d67c5c-xbjdr_3374b90b-3a12-4e01-a0cb-ed7c51d844d7/manager/0.log" Feb 18 15:30:43 crc kubenswrapper[4817]: I0218 15:30:43.023115 4817 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-8pfz7_93f59b48-31c6-4fed-8ccc-7d722605d896/manager/0.log" Feb 18 15:30:43 crc kubenswrapper[4817]: I0218 15:30:43.134811 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-qlrqm_0db531ef-d3b4-4b35-9497-8892cbd3db77/manager/0.log" Feb 18 15:30:48 crc kubenswrapper[4817]: I0218 15:30:48.801868 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-rs8vm_4f0674c2-f05e-4276-b2b0-dc5ed66c187a/manager/0.log" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.004405 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cf95d"] Feb 18 15:30:49 crc kubenswrapper[4817]: E0218 15:30:49.005055 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b0090b0-ef81-41f3-830e-f99d728e796e" containerName="collect-profiles" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.005070 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b0090b0-ef81-41f3-830e-f99d728e796e" containerName="collect-profiles" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.005257 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b0090b0-ef81-41f3-830e-f99d728e796e" containerName="collect-profiles" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.006644 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.021038 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf95d"] Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.142397 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-catalog-content\") pod \"redhat-marketplace-cf95d\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.142570 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfzw6\" (UniqueName: \"kubernetes.io/projected/188a449c-d452-4cd2-bb39-914c797b7e4c-kube-api-access-nfzw6\") pod \"redhat-marketplace-cf95d\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.142610 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-utilities\") pod \"redhat-marketplace-cf95d\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.171590 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:30:49 crc kubenswrapper[4817]: E0218 15:30:49.171888 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.244365 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-catalog-content\") pod \"redhat-marketplace-cf95d\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.245008 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfzw6\" (UniqueName: \"kubernetes.io/projected/188a449c-d452-4cd2-bb39-914c797b7e4c-kube-api-access-nfzw6\") pod \"redhat-marketplace-cf95d\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.245132 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-utilities\") pod \"redhat-marketplace-cf95d\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.245462 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-catalog-content\") pod \"redhat-marketplace-cf95d\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.245579 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-utilities\") pod \"redhat-marketplace-cf95d\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.263142 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfzw6\" (UniqueName: \"kubernetes.io/projected/188a449c-d452-4cd2-bb39-914c797b7e4c-kube-api-access-nfzw6\") pod \"redhat-marketplace-cf95d\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.326528 4817 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:49 crc kubenswrapper[4817]: I0218 15:30:49.962422 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf95d"] Feb 18 15:30:50 crc kubenswrapper[4817]: I0218 15:30:50.824716 4817 generic.go:334] "Generic (PLEG): container finished" podID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerID="3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120" exitCode=0 Feb 18 15:30:50 crc kubenswrapper[4817]: I0218 15:30:50.824832 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf95d" event={"ID":"188a449c-d452-4cd2-bb39-914c797b7e4c","Type":"ContainerDied","Data":"3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120"} Feb 18 15:30:50 crc kubenswrapper[4817]: I0218 15:30:50.825036 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf95d" event={"ID":"188a449c-d452-4cd2-bb39-914c797b7e4c","Type":"ContainerStarted","Data":"bbec770959f117aef418caf78853e11845bb60eaa91537ba52510c576bed2946"} Feb 18 15:30:50 crc kubenswrapper[4817]: I0218 15:30:50.827566 4817 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Feb 18 15:30:51 crc kubenswrapper[4817]: I0218 15:30:51.836362 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf95d" event={"ID":"188a449c-d452-4cd2-bb39-914c797b7e4c","Type":"ContainerStarted","Data":"7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b"} Feb 18 15:30:52 crc kubenswrapper[4817]: I0218 15:30:52.847237 4817 generic.go:334] "Generic (PLEG): container finished" podID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerID="7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b" exitCode=0 Feb 18 15:30:52 crc kubenswrapper[4817]: I0218 15:30:52.847352 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf95d" event={"ID":"188a449c-d452-4cd2-bb39-914c797b7e4c","Type":"ContainerDied","Data":"7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b"} Feb 18 15:30:53 crc kubenswrapper[4817]: I0218 15:30:53.857804 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf95d" event={"ID":"188a449c-d452-4cd2-bb39-914c797b7e4c","Type":"ContainerStarted","Data":"0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d"} Feb 18 15:30:53 crc kubenswrapper[4817]: I0218 15:30:53.888276 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cf95d" podStartSLOduration=3.338760584 podStartE2EDuration="5.888256711s" podCreationTimestamp="2026-02-18 15:30:48 +0000 UTC" firstStartedPulling="2026-02-18 15:30:50.827293655 +0000 UTC m=+5513.402829638" lastFinishedPulling="2026-02-18 15:30:53.376789782 +0000 UTC m=+5515.952325765" observedRunningTime="2026-02-18 15:30:53.87634811 +0000 UTC m=+5516.451884093" watchObservedRunningTime="2026-02-18 15:30:53.888256711 +0000 UTC m=+5516.463792694" Feb 18 15:30:59 crc kubenswrapper[4817]: I0218 15:30:59.327597 4817 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:59 crc kubenswrapper[4817]: I0218 15:30:59.328241 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:59 crc kubenswrapper[4817]: I0218 15:30:59.377860 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:30:59 crc kubenswrapper[4817]: I0218 15:30:59.965797 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:31:00 crc kubenswrapper[4817]: I0218 15:31:00.020086 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf95d"] Feb 18 15:31:01 crc kubenswrapper[4817]: I0218 15:31:01.936242 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cf95d" podUID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerName="registry-server" containerID="cri-o://0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d" gracePeriod=2 Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.713049 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.839452 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfzw6\" (UniqueName: \"kubernetes.io/projected/188a449c-d452-4cd2-bb39-914c797b7e4c-kube-api-access-nfzw6\") pod \"188a449c-d452-4cd2-bb39-914c797b7e4c\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.839588 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-utilities\") pod \"188a449c-d452-4cd2-bb39-914c797b7e4c\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.839631 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-catalog-content\") pod \"188a449c-d452-4cd2-bb39-914c797b7e4c\" (UID: \"188a449c-d452-4cd2-bb39-914c797b7e4c\") " Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.840684 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-utilities" (OuterVolumeSpecName: "utilities") pod "188a449c-d452-4cd2-bb39-914c797b7e4c" (UID: "188a449c-d452-4cd2-bb39-914c797b7e4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.845511 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188a449c-d452-4cd2-bb39-914c797b7e4c-kube-api-access-nfzw6" (OuterVolumeSpecName: "kube-api-access-nfzw6") pod "188a449c-d452-4cd2-bb39-914c797b7e4c" (UID: "188a449c-d452-4cd2-bb39-914c797b7e4c"). InnerVolumeSpecName "kube-api-access-nfzw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.868764 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "188a449c-d452-4cd2-bb39-914c797b7e4c" (UID: "188a449c-d452-4cd2-bb39-914c797b7e4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.942924 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.942959 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/188a449c-d452-4cd2-bb39-914c797b7e4c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.942992 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfzw6\" (UniqueName: \"kubernetes.io/projected/188a449c-d452-4cd2-bb39-914c797b7e4c-kube-api-access-nfzw6\") on node \"crc\" DevicePath \"\"" Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.948561 4817 generic.go:334] "Generic (PLEG): container finished" podID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerID="0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d" exitCode=0 Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.948609 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf95d" event={"ID":"188a449c-d452-4cd2-bb39-914c797b7e4c","Type":"ContainerDied","Data":"0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d"} Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.948634 4817 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf95d" Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.948652 4817 scope.go:117] "RemoveContainer" containerID="0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d" Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.948639 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf95d" event={"ID":"188a449c-d452-4cd2-bb39-914c797b7e4c","Type":"ContainerDied","Data":"bbec770959f117aef418caf78853e11845bb60eaa91537ba52510c576bed2946"} Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.977605 4817 scope.go:117] "RemoveContainer" containerID="7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b" Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.985932 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf95d"] Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.994608 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf95d"] Feb 18 15:31:02 crc kubenswrapper[4817]: I0218 15:31:02.997521 4817 scope.go:117] "RemoveContainer" containerID="3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120" Feb 18 15:31:03 crc kubenswrapper[4817]: I0218 15:31:03.044917 4817 scope.go:117] "RemoveContainer" containerID="0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d" Feb 18 15:31:03 crc kubenswrapper[4817]: E0218 15:31:03.045431 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d\": container with ID starting with 0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d not found: ID does not exist" containerID="0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d" Feb 18 15:31:03 crc kubenswrapper[4817]: I0218 15:31:03.045470 4817 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d"} err="failed to get container status \"0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d\": rpc error: code = NotFound desc = could not find container \"0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d\": container with ID starting with 0ce512d6d7b93a329a19b442d3c165d0b80b457497f4d11eeaf60e6d57c60d4d not found: ID does not exist" Feb 18 15:31:03 crc kubenswrapper[4817]: I0218 15:31:03.045501 4817 scope.go:117] "RemoveContainer" containerID="7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b" Feb 18 15:31:03 crc kubenswrapper[4817]: E0218 15:31:03.045822 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b\": container with ID starting with 7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b not found: ID does not exist" containerID="7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b" Feb 18 15:31:03 crc kubenswrapper[4817]: I0218 15:31:03.045860 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b"} err="failed to get container status \"7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b\": rpc error: code = NotFound desc = could not find container \"7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b\": container with ID starting with 7bbb7fb7377a6c3b184492b7d6344f4aafb58d88419beee11bebcc330f87631b not found: ID does not exist" Feb 18 15:31:03 crc kubenswrapper[4817]: I0218 15:31:03.045887 4817 scope.go:117] "RemoveContainer" containerID="3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120" Feb 18 15:31:03 crc kubenswrapper[4817]: E0218 
15:31:03.046211 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120\": container with ID starting with 3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120 not found: ID does not exist" containerID="3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120" Feb 18 15:31:03 crc kubenswrapper[4817]: I0218 15:31:03.046237 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120"} err="failed to get container status \"3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120\": rpc error: code = NotFound desc = could not find container \"3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120\": container with ID starting with 3a22ab53ee45531155813e020bc333a34036bb640c589dc5f81000a343fda120 not found: ID does not exist" Feb 18 15:31:03 crc kubenswrapper[4817]: I0218 15:31:03.171546 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:31:03 crc kubenswrapper[4817]: E0218 15:31:03.171962 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:31:04 crc kubenswrapper[4817]: I0218 15:31:04.182526 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="188a449c-d452-4cd2-bb39-914c797b7e4c" path="/var/lib/kubelet/pods/188a449c-d452-4cd2-bb39-914c797b7e4c/volumes" Feb 18 15:31:07 crc kubenswrapper[4817]: I0218 15:31:07.154243 
4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mj6tz_9f66536d-c481-41b3-b5e5-8259651a95d9/control-plane-machine-set-operator/0.log" Feb 18 15:31:07 crc kubenswrapper[4817]: I0218 15:31:07.431774 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-x8x2r_3bd99710-b175-4115-8944-1fac544145c5/machine-api-operator/0.log" Feb 18 15:31:07 crc kubenswrapper[4817]: I0218 15:31:07.446945 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-x8x2r_3bd99710-b175-4115-8944-1fac544145c5/kube-rbac-proxy/0.log" Feb 18 15:31:18 crc kubenswrapper[4817]: I0218 15:31:18.179654 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:31:18 crc kubenswrapper[4817]: E0218 15:31:18.180433 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:31:20 crc kubenswrapper[4817]: I0218 15:31:20.075769 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-52jl8_cff7c17a-00dd-470b-a121-c8e86485d4ac/cert-manager-controller/0.log" Feb 18 15:31:20 crc kubenswrapper[4817]: I0218 15:31:20.251568 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-kgq95_00d35822-7854-4547-8c51-7d8f747fcb9c/cert-manager-cainjector/0.log" Feb 18 15:31:20 crc kubenswrapper[4817]: I0218 15:31:20.309180 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-l2hvx_64c3da31-0521-4691-86b4-66f99e11c898/cert-manager-webhook/0.log" Feb 18 15:31:29 crc kubenswrapper[4817]: I0218 15:31:29.172200 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:31:29 crc kubenswrapper[4817]: E0218 15:31:29.173001 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:31:33 crc kubenswrapper[4817]: I0218 15:31:33.639929 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-8qsxw_abf003b3-f87b-4907-ad15-59b8f12108b3/nmstate-console-plugin/0.log" Feb 18 15:31:33 crc kubenswrapper[4817]: I0218 15:31:33.843094 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-l4ntc_8b4f277d-2b45-43de-b3f7-52e968407f19/nmstate-handler/0.log" Feb 18 15:31:33 crc kubenswrapper[4817]: I0218 15:31:33.920199 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-qnftd_65c92dc4-036d-42e0-baa1-3dc7e23c43b3/kube-rbac-proxy/0.log" Feb 18 15:31:34 crc kubenswrapper[4817]: I0218 15:31:34.002423 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-qnftd_65c92dc4-036d-42e0-baa1-3dc7e23c43b3/nmstate-metrics/0.log" Feb 18 15:31:34 crc kubenswrapper[4817]: I0218 15:31:34.196733 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-4jkpg_ca7ab19f-157f-4626-80d3-27ed1a469d95/nmstate-operator/0.log" Feb 18 15:31:34 crc kubenswrapper[4817]: I0218 15:31:34.259138 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-b2vsq_1cc1bcee-c9a0-4bda-9fb1-0f178d5a938a/nmstate-webhook/0.log" Feb 18 15:31:43 crc kubenswrapper[4817]: I0218 15:31:43.171832 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:31:43 crc kubenswrapper[4817]: E0218 15:31:43.174465 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:31:47 crc kubenswrapper[4817]: I0218 15:31:47.451768 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59d4b4c7c-rvnbw_bf27f33f-390f-44fa-91fb-40f18240d0df/kube-rbac-proxy/0.log" Feb 18 15:31:47 crc kubenswrapper[4817]: I0218 15:31:47.656749 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59d4b4c7c-rvnbw_bf27f33f-390f-44fa-91fb-40f18240d0df/manager/0.log" Feb 18 15:31:57 crc kubenswrapper[4817]: I0218 15:31:57.171640 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:31:57 crc kubenswrapper[4817]: E0218 15:31:57.172503 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:32:00 crc kubenswrapper[4817]: I0218 15:32:00.473361 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2p6zw_bc446e23-6b46-40cc-b058-5f8d491d8310/prometheus-operator/0.log" Feb 18 15:32:00 crc kubenswrapper[4817]: I0218 15:32:00.709337 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_fad4abaa-bb3e-4fa2-9478-37e792ead430/prometheus-operator-admission-webhook/0.log" Feb 18 15:32:00 crc kubenswrapper[4817]: I0218 15:32:00.753807 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_fa24c32b-4905-4756-a765-195d6b0b6c1a/prometheus-operator-admission-webhook/0.log" Feb 18 15:32:00 crc kubenswrapper[4817]: I0218 15:32:00.970820 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-xcdgl_02b7c5c2-ac49-498f-9c4c-c64cf484d131/operator/0.log" Feb 18 15:32:01 crc kubenswrapper[4817]: I0218 15:32:01.005245 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qsnlp_f5816544-7d2c-4bf3-aeab-98f546573810/perses-operator/0.log" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.172056 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:32:12 crc kubenswrapper[4817]: E0218 15:32:12.172819 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.582395 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x6jg6"] Feb 18 15:32:12 crc kubenswrapper[4817]: E0218 15:32:12.582806 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerName="extract-content" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.582823 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerName="extract-content" Feb 18 15:32:12 crc kubenswrapper[4817]: E0218 15:32:12.582836 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerName="extract-utilities" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.582843 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerName="extract-utilities" Feb 18 15:32:12 crc kubenswrapper[4817]: E0218 15:32:12.582881 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerName="registry-server" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.582887 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerName="registry-server" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.583094 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="188a449c-d452-4cd2-bb39-914c797b7e4c" containerName="registry-server" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.584482 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.595327 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6jg6"] Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.712935 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-catalog-content\") pod \"redhat-operators-x6jg6\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.712999 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-utilities\") pod \"redhat-operators-x6jg6\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.713064 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn526\" (UniqueName: \"kubernetes.io/projected/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-kube-api-access-kn526\") pod \"redhat-operators-x6jg6\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.815044 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn526\" (UniqueName: \"kubernetes.io/projected/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-kube-api-access-kn526\") pod \"redhat-operators-x6jg6\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.815509 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-catalog-content\") pod \"redhat-operators-x6jg6\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.815644 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-utilities\") pod \"redhat-operators-x6jg6\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.815940 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-catalog-content\") pod \"redhat-operators-x6jg6\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.816136 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-utilities\") pod \"redhat-operators-x6jg6\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.835614 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn526\" (UniqueName: \"kubernetes.io/projected/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-kube-api-access-kn526\") pod \"redhat-operators-x6jg6\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:12 crc kubenswrapper[4817]: I0218 15:32:12.923263 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:13 crc kubenswrapper[4817]: I0218 15:32:13.459116 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6jg6"] Feb 18 15:32:13 crc kubenswrapper[4817]: I0218 15:32:13.603238 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6jg6" event={"ID":"d5ce4f38-77cb-49cd-b921-6d63fe15dac2","Type":"ContainerStarted","Data":"8cd67c52bf5c09e20b845b0112c229742ac8340cf05118f1045947fc7be9d1df"} Feb 18 15:32:14 crc kubenswrapper[4817]: I0218 15:32:14.614078 4817 generic.go:334] "Generic (PLEG): container finished" podID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerID="9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65" exitCode=0 Feb 18 15:32:14 crc kubenswrapper[4817]: I0218 15:32:14.614154 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6jg6" event={"ID":"d5ce4f38-77cb-49cd-b921-6d63fe15dac2","Type":"ContainerDied","Data":"9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65"} Feb 18 15:32:14 crc kubenswrapper[4817]: I0218 15:32:14.983711 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8vc6d"] Feb 18 15:32:14 crc kubenswrapper[4817]: I0218 15:32:14.986706 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.008611 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vc6d"] Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.064029 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvd5s\" (UniqueName: \"kubernetes.io/projected/e6df996b-a456-4566-a21d-d65896d0bc92-kube-api-access-vvd5s\") pod \"community-operators-8vc6d\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.064227 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-utilities\") pod \"community-operators-8vc6d\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.064260 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-catalog-content\") pod \"community-operators-8vc6d\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.166719 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvd5s\" (UniqueName: \"kubernetes.io/projected/e6df996b-a456-4566-a21d-d65896d0bc92-kube-api-access-vvd5s\") pod \"community-operators-8vc6d\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.166884 4817 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-utilities\") pod \"community-operators-8vc6d\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.166919 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-catalog-content\") pod \"community-operators-8vc6d\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.167398 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-catalog-content\") pod \"community-operators-8vc6d\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.167400 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-utilities\") pod \"community-operators-8vc6d\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.195888 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvd5s\" (UniqueName: \"kubernetes.io/projected/e6df996b-a456-4566-a21d-d65896d0bc92-kube-api-access-vvd5s\") pod \"community-operators-8vc6d\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.303747 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:15 crc kubenswrapper[4817]: W0218 15:32:15.899531 4817 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6df996b_a456_4566_a21d_d65896d0bc92.slice/crio-e962f47986c52f5bc045e4264a951ddddbc31a76ab79c2a1bcdbd2004e9b6fdd WatchSource:0}: Error finding container e962f47986c52f5bc045e4264a951ddddbc31a76ab79c2a1bcdbd2004e9b6fdd: Status 404 returned error can't find the container with id e962f47986c52f5bc045e4264a951ddddbc31a76ab79c2a1bcdbd2004e9b6fdd Feb 18 15:32:15 crc kubenswrapper[4817]: I0218 15:32:15.901372 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vc6d"] Feb 18 15:32:16 crc kubenswrapper[4817]: I0218 15:32:16.638964 4817 generic.go:334] "Generic (PLEG): container finished" podID="e6df996b-a456-4566-a21d-d65896d0bc92" containerID="3cb6d579dbe7e3dc6a58168eb2eb5b8d343fef5d4dd0abc497104e3961ea35ea" exitCode=0 Feb 18 15:32:16 crc kubenswrapper[4817]: I0218 15:32:16.639108 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vc6d" event={"ID":"e6df996b-a456-4566-a21d-d65896d0bc92","Type":"ContainerDied","Data":"3cb6d579dbe7e3dc6a58168eb2eb5b8d343fef5d4dd0abc497104e3961ea35ea"} Feb 18 15:32:16 crc kubenswrapper[4817]: I0218 15:32:16.639655 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vc6d" event={"ID":"e6df996b-a456-4566-a21d-d65896d0bc92","Type":"ContainerStarted","Data":"e962f47986c52f5bc045e4264a951ddddbc31a76ab79c2a1bcdbd2004e9b6fdd"} Feb 18 15:32:16 crc kubenswrapper[4817]: I0218 15:32:16.641894 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6jg6" 
event={"ID":"d5ce4f38-77cb-49cd-b921-6d63fe15dac2","Type":"ContainerStarted","Data":"e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0"} Feb 18 15:32:17 crc kubenswrapper[4817]: I0218 15:32:17.653169 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vc6d" event={"ID":"e6df996b-a456-4566-a21d-d65896d0bc92","Type":"ContainerStarted","Data":"97000a6764f4dcdd3b4df1e3eb8e221450e6195de37a9e21d8813ddf49249e1b"} Feb 18 15:32:18 crc kubenswrapper[4817]: I0218 15:32:18.835955 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-z6fs7_5889628d-b78a-4279-95fd-ec441aac9d34/kube-rbac-proxy/0.log" Feb 18 15:32:18 crc kubenswrapper[4817]: I0218 15:32:18.969704 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-z6fs7_5889628d-b78a-4279-95fd-ec441aac9d34/controller/0.log" Feb 18 15:32:19 crc kubenswrapper[4817]: I0218 15:32:19.143754 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-frr-files/0.log" Feb 18 15:32:19 crc kubenswrapper[4817]: I0218 15:32:19.426785 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-frr-files/0.log" Feb 18 15:32:19 crc kubenswrapper[4817]: I0218 15:32:19.428011 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-reloader/0.log" Feb 18 15:32:19 crc kubenswrapper[4817]: I0218 15:32:19.480116 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-reloader/0.log" Feb 18 15:32:19 crc kubenswrapper[4817]: I0218 15:32:19.488675 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-metrics/0.log" Feb 18 
15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.030631 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-metrics/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.063151 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-reloader/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.096807 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-metrics/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.097571 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-frr-files/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.303661 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-frr-files/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.315930 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-reloader/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.403634 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/controller/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.431604 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/cp-metrics/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.642207 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/kube-rbac-proxy-frr/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.655933 4817 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/kube-rbac-proxy/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.689481 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/frr-metrics/0.log" Feb 18 15:32:20 crc kubenswrapper[4817]: I0218 15:32:20.945161 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-kphxt_e2fe6fd7-48f6-47ec-b4b3-60016704bad9/frr-k8s-webhook-server/0.log" Feb 18 15:32:21 crc kubenswrapper[4817]: I0218 15:32:21.037105 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/reloader/0.log" Feb 18 15:32:21 crc kubenswrapper[4817]: I0218 15:32:21.305552 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c8dd94b68-49zzw_ba0591c4-822e-406b-a86b-1f2a6078452c/manager/0.log" Feb 18 15:32:21 crc kubenswrapper[4817]: I0218 15:32:21.707777 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57fdf9bc8-smmwn_4c5c3b60-c65f-4f6c-869b-162ebd95eb32/webhook-server/0.log" Feb 18 15:32:21 crc kubenswrapper[4817]: I0218 15:32:21.709906 4817 generic.go:334] "Generic (PLEG): container finished" podID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerID="e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0" exitCode=0 Feb 18 15:32:21 crc kubenswrapper[4817]: I0218 15:32:21.710016 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6jg6" event={"ID":"d5ce4f38-77cb-49cd-b921-6d63fe15dac2","Type":"ContainerDied","Data":"e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0"} Feb 18 15:32:21 crc kubenswrapper[4817]: I0218 15:32:21.724431 4817 generic.go:334] "Generic (PLEG): container finished" 
podID="e6df996b-a456-4566-a21d-d65896d0bc92" containerID="97000a6764f4dcdd3b4df1e3eb8e221450e6195de37a9e21d8813ddf49249e1b" exitCode=0 Feb 18 15:32:21 crc kubenswrapper[4817]: I0218 15:32:21.724647 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vc6d" event={"ID":"e6df996b-a456-4566-a21d-d65896d0bc92","Type":"ContainerDied","Data":"97000a6764f4dcdd3b4df1e3eb8e221450e6195de37a9e21d8813ddf49249e1b"} Feb 18 15:32:21 crc kubenswrapper[4817]: I0218 15:32:21.816294 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rzsqh_2ce92458-8bf0-41c0-95d1-219f6c35cdf5/kube-rbac-proxy/0.log" Feb 18 15:32:21 crc kubenswrapper[4817]: I0218 15:32:21.921391 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dzb5b_ea454868-79b0-415d-8c0a-6c176b3ca98b/frr/0.log" Feb 18 15:32:22 crc kubenswrapper[4817]: I0218 15:32:22.323779 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rzsqh_2ce92458-8bf0-41c0-95d1-219f6c35cdf5/speaker/0.log" Feb 18 15:32:22 crc kubenswrapper[4817]: I0218 15:32:22.737658 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6jg6" event={"ID":"d5ce4f38-77cb-49cd-b921-6d63fe15dac2","Type":"ContainerStarted","Data":"59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef"} Feb 18 15:32:22 crc kubenswrapper[4817]: I0218 15:32:22.741819 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vc6d" event={"ID":"e6df996b-a456-4566-a21d-d65896d0bc92","Type":"ContainerStarted","Data":"58902be658f1030d034079e95ac9119d77863e8790cc1b121aba497534644594"} Feb 18 15:32:22 crc kubenswrapper[4817]: I0218 15:32:22.827149 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x6jg6" podStartSLOduration=3.107684612 podStartE2EDuration="10.827127418s" 
podCreationTimestamp="2026-02-18 15:32:12 +0000 UTC" firstStartedPulling="2026-02-18 15:32:14.616591688 +0000 UTC m=+5597.192127671" lastFinishedPulling="2026-02-18 15:32:22.336034494 +0000 UTC m=+5604.911570477" observedRunningTime="2026-02-18 15:32:22.784275855 +0000 UTC m=+5605.359811858" watchObservedRunningTime="2026-02-18 15:32:22.827127418 +0000 UTC m=+5605.402663401" Feb 18 15:32:22 crc kubenswrapper[4817]: I0218 15:32:22.833241 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8vc6d" podStartSLOduration=3.346364823 podStartE2EDuration="8.833225063s" podCreationTimestamp="2026-02-18 15:32:14 +0000 UTC" firstStartedPulling="2026-02-18 15:32:16.640844108 +0000 UTC m=+5599.216380091" lastFinishedPulling="2026-02-18 15:32:22.127704348 +0000 UTC m=+5604.703240331" observedRunningTime="2026-02-18 15:32:22.824595004 +0000 UTC m=+5605.400130987" watchObservedRunningTime="2026-02-18 15:32:22.833225063 +0000 UTC m=+5605.408761046" Feb 18 15:32:22 crc kubenswrapper[4817]: I0218 15:32:22.923393 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:22 crc kubenswrapper[4817]: I0218 15:32:22.923493 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:23 crc kubenswrapper[4817]: I0218 15:32:23.976093 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x6jg6" podUID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerName="registry-server" probeResult="failure" output=< Feb 18 15:32:23 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 15:32:23 crc kubenswrapper[4817]: > Feb 18 15:32:24 crc kubenswrapper[4817]: I0218 15:32:24.171734 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514" Feb 18 15:32:24 
crc kubenswrapper[4817]: I0218 15:32:24.762801 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"cb50bfaa6f54127fe536eead08106baf14e441ffb218d0ce957318bef07ec7c2"} Feb 18 15:32:25 crc kubenswrapper[4817]: I0218 15:32:25.305186 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:25 crc kubenswrapper[4817]: I0218 15:32:25.305538 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:26 crc kubenswrapper[4817]: I0218 15:32:26.364514 4817 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8vc6d" podUID="e6df996b-a456-4566-a21d-d65896d0bc92" containerName="registry-server" probeResult="failure" output=< Feb 18 15:32:26 crc kubenswrapper[4817]: timeout: failed to connect service ":50051" within 1s Feb 18 15:32:26 crc kubenswrapper[4817]: > Feb 18 15:32:32 crc kubenswrapper[4817]: I0218 15:32:32.974757 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:33 crc kubenswrapper[4817]: I0218 15:32:33.032134 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:33 crc kubenswrapper[4817]: I0218 15:32:33.222232 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6jg6"] Feb 18 15:32:34 crc kubenswrapper[4817]: I0218 15:32:34.857883 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x6jg6" podUID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerName="registry-server" 
containerID="cri-o://59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef" gracePeriod=2 Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.376324 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.435169 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.662612 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.724911 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-utilities\") pod \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.725075 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn526\" (UniqueName: \"kubernetes.io/projected/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-kube-api-access-kn526\") pod \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.725199 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-catalog-content\") pod \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\" (UID: \"d5ce4f38-77cb-49cd-b921-6d63fe15dac2\") " Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.725793 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-utilities" (OuterVolumeSpecName: "utilities") pod 
"d5ce4f38-77cb-49cd-b921-6d63fe15dac2" (UID: "d5ce4f38-77cb-49cd-b921-6d63fe15dac2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.751117 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-kube-api-access-kn526" (OuterVolumeSpecName: "kube-api-access-kn526") pod "d5ce4f38-77cb-49cd-b921-6d63fe15dac2" (UID: "d5ce4f38-77cb-49cd-b921-6d63fe15dac2"). InnerVolumeSpecName "kube-api-access-kn526". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.827372 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.827435 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn526\" (UniqueName: \"kubernetes.io/projected/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-kube-api-access-kn526\") on node \"crc\" DevicePath \"\"" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.872201 4817 generic.go:334] "Generic (PLEG): container finished" podID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerID="59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef" exitCode=0 Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.872699 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6jg6" event={"ID":"d5ce4f38-77cb-49cd-b921-6d63fe15dac2","Type":"ContainerDied","Data":"59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef"} Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.872756 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6jg6" 
event={"ID":"d5ce4f38-77cb-49cd-b921-6d63fe15dac2","Type":"ContainerDied","Data":"8cd67c52bf5c09e20b845b0112c229742ac8340cf05118f1045947fc7be9d1df"} Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.872782 4817 scope.go:117] "RemoveContainer" containerID="59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.872784 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6jg6" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.888153 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5ce4f38-77cb-49cd-b921-6d63fe15dac2" (UID: "d5ce4f38-77cb-49cd-b921-6d63fe15dac2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.895113 4817 scope.go:117] "RemoveContainer" containerID="e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.918885 4817 scope.go:117] "RemoveContainer" containerID="9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.930416 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5ce4f38-77cb-49cd-b921-6d63fe15dac2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.968794 4817 scope.go:117] "RemoveContainer" containerID="59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef" Feb 18 15:32:35 crc kubenswrapper[4817]: E0218 15:32:35.969799 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef\": container with ID starting with 59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef not found: ID does not exist" containerID="59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.969840 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef"} err="failed to get container status \"59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef\": rpc error: code = NotFound desc = could not find container \"59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef\": container with ID starting with 59291abb4ccbda7ff12f8caa54d244c4812d70a0c2294558d026515d821f35ef not found: ID does not exist" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.969867 4817 scope.go:117] "RemoveContainer" containerID="e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0" Feb 18 15:32:35 crc kubenswrapper[4817]: E0218 15:32:35.970839 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0\": container with ID starting with e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0 not found: ID does not exist" containerID="e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.970913 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0"} err="failed to get container status \"e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0\": rpc error: code = NotFound desc = could not find container \"e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0\": container with ID 
starting with e41cc465ec84d88c0dbec0ea1c4aa51073cfd14bdf45102cccdef54936f12cc0 not found: ID does not exist" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.970950 4817 scope.go:117] "RemoveContainer" containerID="9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65" Feb 18 15:32:35 crc kubenswrapper[4817]: E0218 15:32:35.971270 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65\": container with ID starting with 9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65 not found: ID does not exist" containerID="9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65" Feb 18 15:32:35 crc kubenswrapper[4817]: I0218 15:32:35.971310 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65"} err="failed to get container status \"9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65\": rpc error: code = NotFound desc = could not find container \"9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65\": container with ID starting with 9de91cb0e58a30483fd1d45b54c45f5d3e9880d13438a48f5b3cd10dc60eac65 not found: ID does not exist" Feb 18 15:32:36 crc kubenswrapper[4817]: I0218 15:32:36.202590 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6jg6"] Feb 18 15:32:36 crc kubenswrapper[4817]: I0218 15:32:36.211288 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x6jg6"] Feb 18 15:32:37 crc kubenswrapper[4817]: I0218 15:32:37.429046 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vc6d"] Feb 18 15:32:37 crc kubenswrapper[4817]: I0218 15:32:37.429325 4817 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-8vc6d" podUID="e6df996b-a456-4566-a21d-d65896d0bc92" containerName="registry-server" containerID="cri-o://58902be658f1030d034079e95ac9119d77863e8790cc1b121aba497534644594" gracePeriod=2 Feb 18 15:32:37 crc kubenswrapper[4817]: I0218 15:32:37.920381 4817 generic.go:334] "Generic (PLEG): container finished" podID="e6df996b-a456-4566-a21d-d65896d0bc92" containerID="58902be658f1030d034079e95ac9119d77863e8790cc1b121aba497534644594" exitCode=0 Feb 18 15:32:37 crc kubenswrapper[4817]: I0218 15:32:37.920426 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vc6d" event={"ID":"e6df996b-a456-4566-a21d-d65896d0bc92","Type":"ContainerDied","Data":"58902be658f1030d034079e95ac9119d77863e8790cc1b121aba497534644594"} Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.182894 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" path="/var/lib/kubelet/pods/d5ce4f38-77cb-49cd-b921-6d63fe15dac2/volumes" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.289422 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.384527 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-catalog-content\") pod \"e6df996b-a456-4566-a21d-d65896d0bc92\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.385721 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-utilities\") pod \"e6df996b-a456-4566-a21d-d65896d0bc92\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.385965 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvd5s\" (UniqueName: \"kubernetes.io/projected/e6df996b-a456-4566-a21d-d65896d0bc92-kube-api-access-vvd5s\") pod \"e6df996b-a456-4566-a21d-d65896d0bc92\" (UID: \"e6df996b-a456-4566-a21d-d65896d0bc92\") " Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.388062 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-utilities" (OuterVolumeSpecName: "utilities") pod "e6df996b-a456-4566-a21d-d65896d0bc92" (UID: "e6df996b-a456-4566-a21d-d65896d0bc92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.393396 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6df996b-a456-4566-a21d-d65896d0bc92-kube-api-access-vvd5s" (OuterVolumeSpecName: "kube-api-access-vvd5s") pod "e6df996b-a456-4566-a21d-d65896d0bc92" (UID: "e6df996b-a456-4566-a21d-d65896d0bc92"). InnerVolumeSpecName "kube-api-access-vvd5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.450867 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6df996b-a456-4566-a21d-d65896d0bc92" (UID: "e6df996b-a456-4566-a21d-d65896d0bc92"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.489106 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvd5s\" (UniqueName: \"kubernetes.io/projected/e6df996b-a456-4566-a21d-d65896d0bc92-kube-api-access-vvd5s\") on node \"crc\" DevicePath \"\"" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.489148 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.489161 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6df996b-a456-4566-a21d-d65896d0bc92-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.932637 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vc6d" event={"ID":"e6df996b-a456-4566-a21d-d65896d0bc92","Type":"ContainerDied","Data":"e962f47986c52f5bc045e4264a951ddddbc31a76ab79c2a1bcdbd2004e9b6fdd"} Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.932969 4817 scope.go:117] "RemoveContainer" containerID="58902be658f1030d034079e95ac9119d77863e8790cc1b121aba497534644594" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.932693 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vc6d" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.955612 4817 scope.go:117] "RemoveContainer" containerID="97000a6764f4dcdd3b4df1e3eb8e221450e6195de37a9e21d8813ddf49249e1b" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.974380 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vc6d"] Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.989928 4817 scope.go:117] "RemoveContainer" containerID="3cb6d579dbe7e3dc6a58168eb2eb5b8d343fef5d4dd0abc497104e3961ea35ea" Feb 18 15:32:38 crc kubenswrapper[4817]: I0218 15:32:38.999650 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8vc6d"] Feb 18 15:32:39 crc kubenswrapper[4817]: I0218 15:32:39.806545 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/util/0.log" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.085965 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/util/0.log" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.108024 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/pull/0.log" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.194710 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6df996b-a456-4566-a21d-d65896d0bc92" path="/var/lib/kubelet/pods/e6df996b-a456-4566-a21d-d65896d0bc92/volumes" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.239093 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/pull/0.log" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.394905 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/extract/0.log" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.411334 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/pull/0.log" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.413549 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651c5glp_b460cb7c-dd22-42e4-91a1-1eee6a8340dc/util/0.log" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.554030 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/util/0.log" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.759617 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/util/0.log" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.762336 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/pull/0.log" Feb 18 15:32:40 crc kubenswrapper[4817]: I0218 15:32:40.812290 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/pull/0.log" Feb 18 
15:32:41 crc kubenswrapper[4817]: I0218 15:32:41.246386 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/util/0.log" Feb 18 15:32:41 crc kubenswrapper[4817]: I0218 15:32:41.309682 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/extract/0.log" Feb 18 15:32:41 crc kubenswrapper[4817]: I0218 15:32:41.380633 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wjdsk_f23865a8-7bc6-47b8-a9c8-6c9188463757/pull/0.log" Feb 18 15:32:41 crc kubenswrapper[4817]: I0218 15:32:41.543988 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/util/0.log" Feb 18 15:32:41 crc kubenswrapper[4817]: I0218 15:32:41.822671 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/pull/0.log" Feb 18 15:32:41 crc kubenswrapper[4817]: I0218 15:32:41.837504 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/pull/0.log" Feb 18 15:32:41 crc kubenswrapper[4817]: I0218 15:32:41.846370 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/util/0.log" Feb 18 15:32:42 crc kubenswrapper[4817]: I0218 15:32:42.029701 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/extract/0.log" Feb 18 15:32:42 crc kubenswrapper[4817]: I0218 15:32:42.043736 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/util/0.log" Feb 18 15:32:42 crc kubenswrapper[4817]: I0218 15:32:42.079942 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213rgxhg_d8a0b33c-2815-43e2-bdcc-6a1b99682d34/pull/0.log" Feb 18 15:32:42 crc kubenswrapper[4817]: I0218 15:32:42.243658 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-utilities/0.log" Feb 18 15:32:42 crc kubenswrapper[4817]: I0218 15:32:42.561124 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-content/0.log" Feb 18 15:32:42 crc kubenswrapper[4817]: I0218 15:32:42.602892 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-utilities/0.log" Feb 18 15:32:42 crc kubenswrapper[4817]: I0218 15:32:42.620593 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-content/0.log" Feb 18 15:32:42 crc kubenswrapper[4817]: I0218 15:32:42.831411 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-utilities/0.log" Feb 18 15:32:42 crc kubenswrapper[4817]: I0218 15:32:42.849532 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/extract-content/0.log" Feb 18 15:32:43 crc kubenswrapper[4817]: I0218 15:32:43.137285 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-utilities/0.log" Feb 18 15:32:43 crc kubenswrapper[4817]: I0218 15:32:43.316799 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-utilities/0.log" Feb 18 15:32:43 crc kubenswrapper[4817]: I0218 15:32:43.391580 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-content/0.log" Feb 18 15:32:43 crc kubenswrapper[4817]: I0218 15:32:43.476019 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-content/0.log" Feb 18 15:32:43 crc kubenswrapper[4817]: I0218 15:32:43.675392 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-95btg_97057c75-124d-48f2-8931-667fa9ad766f/registry-server/0.log" Feb 18 15:32:43 crc kubenswrapper[4817]: I0218 15:32:43.795264 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-content/0.log" Feb 18 15:32:43 crc kubenswrapper[4817]: I0218 15:32:43.874463 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/extract-utilities/0.log" Feb 18 15:32:44 crc kubenswrapper[4817]: I0218 15:32:44.131668 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/util/0.log" Feb 18 15:32:44 crc kubenswrapper[4817]: I0218 15:32:44.318169 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7657x_80976ca8-de28-4b71-a0d1-f3aeb4410466/registry-server/0.log" Feb 18 15:32:44 crc kubenswrapper[4817]: I0218 15:32:44.369303 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/util/0.log" Feb 18 15:32:44 crc kubenswrapper[4817]: I0218 15:32:44.371192 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/pull/0.log" Feb 18 15:32:44 crc kubenswrapper[4817]: I0218 15:32:44.422042 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/pull/0.log" Feb 18 15:32:44 crc kubenswrapper[4817]: I0218 15:32:44.605323 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/util/0.log" Feb 18 15:32:44 crc kubenswrapper[4817]: I0218 15:32:44.615942 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/pull/0.log" Feb 18 15:32:44 crc kubenswrapper[4817]: I0218 15:32:44.661691 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca45p62_ca2187cb-8ba5-4146-a506-4989f6bade5c/extract/0.log" Feb 18 15:32:44 crc 
kubenswrapper[4817]: I0218 15:32:44.731729 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cw95t_039201b1-3f23-4f22-80cb-17f07e1732df/marketplace-operator/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.094877 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-utilities/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.377110 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-utilities/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.410303 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-content/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.448792 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-content/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.589076 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-content/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.648509 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/extract-utilities/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.692587 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-utilities/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.855431 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-jbhmn_32480fcf-d389-4f17-adee-4870e948038c/registry-server/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.932962 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-utilities/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.958100 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-content/0.log" Feb 18 15:32:45 crc kubenswrapper[4817]: I0218 15:32:45.958188 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-content/0.log" Feb 18 15:32:46 crc kubenswrapper[4817]: I0218 15:32:46.114236 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-utilities/0.log" Feb 18 15:32:46 crc kubenswrapper[4817]: I0218 15:32:46.142277 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/extract-content/0.log" Feb 18 15:32:46 crc kubenswrapper[4817]: I0218 15:32:46.581127 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hsbc9_4185e717-2ef8-456e-ad88-f8a65231cd06/registry-server/0.log" Feb 18 15:33:01 crc kubenswrapper[4817]: I0218 15:33:01.038664 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2p6zw_bc446e23-6b46-40cc-b058-5f8d491d8310/prometheus-operator/0.log" Feb 18 15:33:01 crc kubenswrapper[4817]: I0218 15:33:01.115613 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bcf4d9f-vclxw_fa24c32b-4905-4756-a765-195d6b0b6c1a/prometheus-operator-admission-webhook/0.log" Feb 18 15:33:01 crc kubenswrapper[4817]: I0218 15:33:01.159603 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6bcf4d9f-77bzh_fad4abaa-bb3e-4fa2-9478-37e792ead430/prometheus-operator-admission-webhook/0.log" Feb 18 15:33:01 crc kubenswrapper[4817]: I0218 15:33:01.343620 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-xcdgl_02b7c5c2-ac49-498f-9c4c-c64cf484d131/operator/0.log" Feb 18 15:33:01 crc kubenswrapper[4817]: I0218 15:33:01.392510 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-qsnlp_f5816544-7d2c-4bf3-aeab-98f546573810/perses-operator/0.log" Feb 18 15:33:16 crc kubenswrapper[4817]: I0218 15:33:16.851622 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59d4b4c7c-rvnbw_bf27f33f-390f-44fa-91fb-40f18240d0df/kube-rbac-proxy/0.log" Feb 18 15:33:16 crc kubenswrapper[4817]: I0218 15:33:16.866869 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-59d4b4c7c-rvnbw_bf27f33f-390f-44fa-91fb-40f18240d0df/manager/0.log" Feb 18 15:34:42 crc kubenswrapper[4817]: I0218 15:34:42.863556 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:34:42 crc kubenswrapper[4817]: I0218 15:34:42.864055 4817 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.413108 4817 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-67r9c"] Feb 18 15:35:06 crc kubenswrapper[4817]: E0218 15:35:06.414448 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerName="extract-utilities" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.414472 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerName="extract-utilities" Feb 18 15:35:06 crc kubenswrapper[4817]: E0218 15:35:06.414501 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6df996b-a456-4566-a21d-d65896d0bc92" containerName="registry-server" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.414513 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6df996b-a456-4566-a21d-d65896d0bc92" containerName="registry-server" Feb 18 15:35:06 crc kubenswrapper[4817]: E0218 15:35:06.414548 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6df996b-a456-4566-a21d-d65896d0bc92" containerName="extract-content" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.414559 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6df996b-a456-4566-a21d-d65896d0bc92" containerName="extract-content" Feb 18 15:35:06 crc kubenswrapper[4817]: E0218 15:35:06.414601 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerName="registry-server" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.414613 4817 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerName="registry-server" Feb 18 15:35:06 crc kubenswrapper[4817]: E0218 15:35:06.414657 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerName="extract-content" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.414669 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerName="extract-content" Feb 18 15:35:06 crc kubenswrapper[4817]: E0218 15:35:06.414696 4817 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6df996b-a456-4566-a21d-d65896d0bc92" containerName="extract-utilities" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.414708 4817 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6df996b-a456-4566-a21d-d65896d0bc92" containerName="extract-utilities" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.415045 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6df996b-a456-4566-a21d-d65896d0bc92" containerName="registry-server" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.415080 4817 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ce4f38-77cb-49cd-b921-6d63fe15dac2" containerName="registry-server" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.418323 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.435349 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-67r9c"] Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.502284 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-utilities\") pod \"certified-operators-67r9c\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.502368 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-catalog-content\") pod \"certified-operators-67r9c\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.502559 4817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw9t4\" (UniqueName: \"kubernetes.io/projected/6765764d-cbb5-44a7-ba5d-937be73805d0-kube-api-access-vw9t4\") pod \"certified-operators-67r9c\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.604948 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-utilities\") pod \"certified-operators-67r9c\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.605042 4817 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-catalog-content\") pod \"certified-operators-67r9c\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.605157 4817 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw9t4\" (UniqueName: \"kubernetes.io/projected/6765764d-cbb5-44a7-ba5d-937be73805d0-kube-api-access-vw9t4\") pod \"certified-operators-67r9c\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.605671 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-utilities\") pod \"certified-operators-67r9c\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.605699 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-catalog-content\") pod \"certified-operators-67r9c\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.632447 4817 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw9t4\" (UniqueName: \"kubernetes.io/projected/6765764d-cbb5-44a7-ba5d-937be73805d0-kube-api-access-vw9t4\") pod \"certified-operators-67r9c\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:06 crc kubenswrapper[4817]: I0218 15:35:06.741592 4817 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:07 crc kubenswrapper[4817]: I0218 15:35:07.383968 4817 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-67r9c"] Feb 18 15:35:07 crc kubenswrapper[4817]: I0218 15:35:07.472970 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67r9c" event={"ID":"6765764d-cbb5-44a7-ba5d-937be73805d0","Type":"ContainerStarted","Data":"9f42c95ce3fc33bc77edea6e8f2b2a87c7367a5412509ca1c1edb21f6ff2dd11"} Feb 18 15:35:08 crc kubenswrapper[4817]: I0218 15:35:08.484835 4817 generic.go:334] "Generic (PLEG): container finished" podID="6765764d-cbb5-44a7-ba5d-937be73805d0" containerID="e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39" exitCode=0 Feb 18 15:35:08 crc kubenswrapper[4817]: I0218 15:35:08.484932 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67r9c" event={"ID":"6765764d-cbb5-44a7-ba5d-937be73805d0","Type":"ContainerDied","Data":"e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39"} Feb 18 15:35:09 crc kubenswrapper[4817]: I0218 15:35:09.498636 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67r9c" event={"ID":"6765764d-cbb5-44a7-ba5d-937be73805d0","Type":"ContainerStarted","Data":"56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369"} Feb 18 15:35:11 crc kubenswrapper[4817]: I0218 15:35:11.518543 4817 generic.go:334] "Generic (PLEG): container finished" podID="6765764d-cbb5-44a7-ba5d-937be73805d0" containerID="56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369" exitCode=0 Feb 18 15:35:11 crc kubenswrapper[4817]: I0218 15:35:11.518632 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67r9c" 
event={"ID":"6765764d-cbb5-44a7-ba5d-937be73805d0","Type":"ContainerDied","Data":"56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369"} Feb 18 15:35:12 crc kubenswrapper[4817]: I0218 15:35:12.557660 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67r9c" event={"ID":"6765764d-cbb5-44a7-ba5d-937be73805d0","Type":"ContainerStarted","Data":"7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901"} Feb 18 15:35:12 crc kubenswrapper[4817]: I0218 15:35:12.580104 4817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-67r9c" podStartSLOduration=3.124169819 podStartE2EDuration="6.58008408s" podCreationTimestamp="2026-02-18 15:35:06 +0000 UTC" firstStartedPulling="2026-02-18 15:35:08.487371973 +0000 UTC m=+5771.062907956" lastFinishedPulling="2026-02-18 15:35:11.943286204 +0000 UTC m=+5774.518822217" observedRunningTime="2026-02-18 15:35:12.577418113 +0000 UTC m=+5775.152954106" watchObservedRunningTime="2026-02-18 15:35:12.58008408 +0000 UTC m=+5775.155620063" Feb 18 15:35:12 crc kubenswrapper[4817]: I0218 15:35:12.863538 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:35:12 crc kubenswrapper[4817]: I0218 15:35:12.863603 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:35:16 crc kubenswrapper[4817]: I0218 15:35:16.742531 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:16 crc kubenswrapper[4817]: I0218 15:35:16.743081 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:16 crc kubenswrapper[4817]: I0218 15:35:16.796226 4817 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:17 crc kubenswrapper[4817]: I0218 15:35:17.648672 4817 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:17 crc kubenswrapper[4817]: I0218 15:35:17.708563 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-67r9c"] Feb 18 15:35:19 crc kubenswrapper[4817]: I0218 15:35:19.620025 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-67r9c" podUID="6765764d-cbb5-44a7-ba5d-937be73805d0" containerName="registry-server" containerID="cri-o://7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901" gracePeriod=2 Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.426624 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.511524 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw9t4\" (UniqueName: \"kubernetes.io/projected/6765764d-cbb5-44a7-ba5d-937be73805d0-kube-api-access-vw9t4\") pod \"6765764d-cbb5-44a7-ba5d-937be73805d0\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.512025 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-utilities\") pod \"6765764d-cbb5-44a7-ba5d-937be73805d0\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.512110 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-catalog-content\") pod \"6765764d-cbb5-44a7-ba5d-937be73805d0\" (UID: \"6765764d-cbb5-44a7-ba5d-937be73805d0\") " Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.513040 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-utilities" (OuterVolumeSpecName: "utilities") pod "6765764d-cbb5-44a7-ba5d-937be73805d0" (UID: "6765764d-cbb5-44a7-ba5d-937be73805d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.517721 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6765764d-cbb5-44a7-ba5d-937be73805d0-kube-api-access-vw9t4" (OuterVolumeSpecName: "kube-api-access-vw9t4") pod "6765764d-cbb5-44a7-ba5d-937be73805d0" (UID: "6765764d-cbb5-44a7-ba5d-937be73805d0"). InnerVolumeSpecName "kube-api-access-vw9t4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.614875 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw9t4\" (UniqueName: \"kubernetes.io/projected/6765764d-cbb5-44a7-ba5d-937be73805d0-kube-api-access-vw9t4\") on node \"crc\" DevicePath \"\"" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.614928 4817 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-utilities\") on node \"crc\" DevicePath \"\"" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.632804 4817 generic.go:334] "Generic (PLEG): container finished" podID="6765764d-cbb5-44a7-ba5d-937be73805d0" containerID="7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901" exitCode=0 Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.632842 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-67r9c" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.632861 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67r9c" event={"ID":"6765764d-cbb5-44a7-ba5d-937be73805d0","Type":"ContainerDied","Data":"7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901"} Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.632898 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67r9c" event={"ID":"6765764d-cbb5-44a7-ba5d-937be73805d0","Type":"ContainerDied","Data":"9f42c95ce3fc33bc77edea6e8f2b2a87c7367a5412509ca1c1edb21f6ff2dd11"} Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.632917 4817 scope.go:117] "RemoveContainer" containerID="7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.663398 4817 scope.go:117] "RemoveContainer" 
containerID="56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.691812 4817 scope.go:117] "RemoveContainer" containerID="e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.693536 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6765764d-cbb5-44a7-ba5d-937be73805d0" (UID: "6765764d-cbb5-44a7-ba5d-937be73805d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.717750 4817 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6765764d-cbb5-44a7-ba5d-937be73805d0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.741529 4817 scope.go:117] "RemoveContainer" containerID="7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901" Feb 18 15:35:20 crc kubenswrapper[4817]: E0218 15:35:20.742023 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901\": container with ID starting with 7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901 not found: ID does not exist" containerID="7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.742087 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901"} err="failed to get container status \"7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901\": rpc error: code = NotFound desc = could not 
find container \"7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901\": container with ID starting with 7a9a307213eb4a5f50afd603bca65446af543d19c7b5516198bfe6bbd8b5a901 not found: ID does not exist" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.742123 4817 scope.go:117] "RemoveContainer" containerID="56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369" Feb 18 15:35:20 crc kubenswrapper[4817]: E0218 15:35:20.742463 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369\": container with ID starting with 56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369 not found: ID does not exist" containerID="56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.742498 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369"} err="failed to get container status \"56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369\": rpc error: code = NotFound desc = could not find container \"56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369\": container with ID starting with 56235d2dfd0493c552ce81df82c18fe59415eec6e8a2b872386fa3116a50c369 not found: ID does not exist" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.742520 4817 scope.go:117] "RemoveContainer" containerID="e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39" Feb 18 15:35:20 crc kubenswrapper[4817]: E0218 15:35:20.742791 4817 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39\": container with ID starting with e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39 not found: ID 
does not exist" containerID="e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.742846 4817 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39"} err="failed to get container status \"e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39\": rpc error: code = NotFound desc = could not find container \"e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39\": container with ID starting with e6323938cbde0f9b8673d839b3fca4b82d205b56978e1d031a5c341ec7684b39 not found: ID does not exist" Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.972524 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-67r9c"] Feb 18 15:35:20 crc kubenswrapper[4817]: I0218 15:35:20.981146 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-67r9c"] Feb 18 15:35:22 crc kubenswrapper[4817]: I0218 15:35:22.183791 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6765764d-cbb5-44a7-ba5d-937be73805d0" path="/var/lib/kubelet/pods/6765764d-cbb5-44a7-ba5d-937be73805d0/volumes" Feb 18 15:35:25 crc kubenswrapper[4817]: I0218 15:35:25.681624 4817 generic.go:334] "Generic (PLEG): container finished" podID="1934f4f8-1ea2-4e98-bf8e-38ae890846e6" containerID="0ca754a60390073d53c6e9963f75a13413d45ba21c76dcea39c27b15c109f77e" exitCode=0 Feb 18 15:35:25 crc kubenswrapper[4817]: I0218 15:35:25.681722 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6c46z/must-gather-t2lxx" event={"ID":"1934f4f8-1ea2-4e98-bf8e-38ae890846e6","Type":"ContainerDied","Data":"0ca754a60390073d53c6e9963f75a13413d45ba21c76dcea39c27b15c109f77e"} Feb 18 15:35:25 crc kubenswrapper[4817]: I0218 15:35:25.682829 4817 scope.go:117] "RemoveContainer" 
containerID="0ca754a60390073d53c6e9963f75a13413d45ba21c76dcea39c27b15c109f77e" Feb 18 15:35:25 crc kubenswrapper[4817]: I0218 15:35:25.981426 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6c46z_must-gather-t2lxx_1934f4f8-1ea2-4e98-bf8e-38ae890846e6/gather/0.log" Feb 18 15:35:34 crc kubenswrapper[4817]: I0218 15:35:34.940745 4817 scope.go:117] "RemoveContainer" containerID="4c2f4e5f4370220ea6a491124de236510daa3a12d6915cd706d301a3bade5e74" Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.180943 4817 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6c46z/must-gather-t2lxx"] Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.181767 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6c46z/must-gather-t2lxx" podUID="1934f4f8-1ea2-4e98-bf8e-38ae890846e6" containerName="copy" containerID="cri-o://e09d56384f727cc8a35e6fa01eacf222a5650536714dc270e270fccedc921ad2" gracePeriod=2 Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.191268 4817 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6c46z/must-gather-t2lxx"] Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.821830 4817 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6c46z_must-gather-t2lxx_1934f4f8-1ea2-4e98-bf8e-38ae890846e6/copy/0.log" Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.823367 4817 generic.go:334] "Generic (PLEG): container finished" podID="1934f4f8-1ea2-4e98-bf8e-38ae890846e6" containerID="e09d56384f727cc8a35e6fa01eacf222a5650536714dc270e270fccedc921ad2" exitCode=143 Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.823450 4817 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32dd39c30817a032bcb1421ae52c04ee206cd210191f0a0d54f7002dff80ee92" Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.858988 4817 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-6c46z_must-gather-t2lxx_1934f4f8-1ea2-4e98-bf8e-38ae890846e6/copy/0.log" Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.859399 4817 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6c46z/must-gather-t2lxx" Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.935139 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csrj5\" (UniqueName: \"kubernetes.io/projected/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-kube-api-access-csrj5\") pod \"1934f4f8-1ea2-4e98-bf8e-38ae890846e6\" (UID: \"1934f4f8-1ea2-4e98-bf8e-38ae890846e6\") " Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.935435 4817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-must-gather-output\") pod \"1934f4f8-1ea2-4e98-bf8e-38ae890846e6\" (UID: \"1934f4f8-1ea2-4e98-bf8e-38ae890846e6\") " Feb 18 15:35:39 crc kubenswrapper[4817]: I0218 15:35:39.948167 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-kube-api-access-csrj5" (OuterVolumeSpecName: "kube-api-access-csrj5") pod "1934f4f8-1ea2-4e98-bf8e-38ae890846e6" (UID: "1934f4f8-1ea2-4e98-bf8e-38ae890846e6"). InnerVolumeSpecName "kube-api-access-csrj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 18 15:35:40 crc kubenswrapper[4817]: I0218 15:35:40.038374 4817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csrj5\" (UniqueName: \"kubernetes.io/projected/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-kube-api-access-csrj5\") on node \"crc\" DevicePath \"\"" Feb 18 15:35:40 crc kubenswrapper[4817]: I0218 15:35:40.220649 4817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1934f4f8-1ea2-4e98-bf8e-38ae890846e6" (UID: "1934f4f8-1ea2-4e98-bf8e-38ae890846e6"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 18 15:35:40 crc kubenswrapper[4817]: I0218 15:35:40.250639 4817 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1934f4f8-1ea2-4e98-bf8e-38ae890846e6-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 18 15:35:40 crc kubenswrapper[4817]: I0218 15:35:40.831119 4817 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6c46z/must-gather-t2lxx" Feb 18 15:35:42 crc kubenswrapper[4817]: I0218 15:35:42.181887 4817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1934f4f8-1ea2-4e98-bf8e-38ae890846e6" path="/var/lib/kubelet/pods/1934f4f8-1ea2-4e98-bf8e-38ae890846e6/volumes" Feb 18 15:35:42 crc kubenswrapper[4817]: I0218 15:35:42.863239 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 18 15:35:42 crc kubenswrapper[4817]: I0218 15:35:42.863298 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 18 15:35:42 crc kubenswrapper[4817]: I0218 15:35:42.863340 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" Feb 18 15:35:42 crc kubenswrapper[4817]: I0218 15:35:42.864112 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb50bfaa6f54127fe536eead08106baf14e441ffb218d0ce957318bef07ec7c2"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 18 15:35:42 crc kubenswrapper[4817]: I0218 15:35:42.864176 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" 
containerName="machine-config-daemon" containerID="cri-o://cb50bfaa6f54127fe536eead08106baf14e441ffb218d0ce957318bef07ec7c2" gracePeriod=600
Feb 18 15:35:43 crc kubenswrapper[4817]: I0218 15:35:43.864451 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="cb50bfaa6f54127fe536eead08106baf14e441ffb218d0ce957318bef07ec7c2" exitCode=0
Feb 18 15:35:43 crc kubenswrapper[4817]: I0218 15:35:43.864534 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"cb50bfaa6f54127fe536eead08106baf14e441ffb218d0ce957318bef07ec7c2"}
Feb 18 15:35:43 crc kubenswrapper[4817]: I0218 15:35:43.864744 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerStarted","Data":"6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca"}
Feb 18 15:35:43 crc kubenswrapper[4817]: I0218 15:35:43.864771 4817 scope.go:117] "RemoveContainer" containerID="4391c0b6a1ccf075965fda152c64cede71fd2ad1d19321476a2f1c8612ddc514"
Feb 18 15:36:35 crc kubenswrapper[4817]: I0218 15:36:35.004704 4817 scope.go:117] "RemoveContainer" containerID="0ca754a60390073d53c6e9963f75a13413d45ba21c76dcea39c27b15c109f77e"
Feb 18 15:36:35 crc kubenswrapper[4817]: I0218 15:36:35.074687 4817 scope.go:117] "RemoveContainer" containerID="e09d56384f727cc8a35e6fa01eacf222a5650536714dc270e270fccedc921ad2"
Feb 18 15:38:12 crc kubenswrapper[4817]: I0218 15:38:12.863841 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 15:38:12 crc kubenswrapper[4817]: I0218 15:38:12.864562 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 15:38:42 crc kubenswrapper[4817]: I0218 15:38:42.863906 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 15:38:42 crc kubenswrapper[4817]: I0218 15:38:42.864502 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 15:39:12 crc kubenswrapper[4817]: I0218 15:39:12.863953 4817 patch_prober.go:28] interesting pod/machine-config-daemon-g6zzb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 18 15:39:12 crc kubenswrapper[4817]: I0218 15:39:12.864635 4817 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 18 15:39:12 crc kubenswrapper[4817]: I0218 15:39:12.864691 4817 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb"
Feb 18 15:39:12 crc kubenswrapper[4817]: I0218 15:39:12.865624 4817 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca"} pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 18 15:39:12 crc kubenswrapper[4817]: I0218 15:39:12.865689 4817 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerName="machine-config-daemon" containerID="cri-o://6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca" gracePeriod=600
Feb 18 15:39:12 crc kubenswrapper[4817]: E0218 15:39:12.987833 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:39:13 crc kubenswrapper[4817]: I0218 15:39:13.871493 4817 generic.go:334] "Generic (PLEG): container finished" podID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5" containerID="6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca" exitCode=0
Feb 18 15:39:13 crc kubenswrapper[4817]: I0218 15:39:13.871551 4817 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" event={"ID":"b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5","Type":"ContainerDied","Data":"6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca"}
Feb 18 15:39:13 crc kubenswrapper[4817]: I0218 15:39:13.871601 4817 scope.go:117] "RemoveContainer" containerID="cb50bfaa6f54127fe536eead08106baf14e441ffb218d0ce957318bef07ec7c2"
Feb 18 15:39:13 crc kubenswrapper[4817]: I0218 15:39:13.872695 4817 scope.go:117] "RemoveContainer" containerID="6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca"
Feb 18 15:39:13 crc kubenswrapper[4817]: E0218 15:39:13.873154 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:39:29 crc kubenswrapper[4817]: I0218 15:39:29.172257 4817 scope.go:117] "RemoveContainer" containerID="6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca"
Feb 18 15:39:29 crc kubenswrapper[4817]: E0218 15:39:29.173173 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:39:40 crc kubenswrapper[4817]: I0218 15:39:40.172015 4817 scope.go:117] "RemoveContainer" containerID="6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca"
Feb 18 15:39:40 crc kubenswrapper[4817]: E0218 15:39:40.172823 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:39:52 crc kubenswrapper[4817]: I0218 15:39:52.171832 4817 scope.go:117] "RemoveContainer" containerID="6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca"
Feb 18 15:39:52 crc kubenswrapper[4817]: E0218 15:39:52.174624 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:40:05 crc kubenswrapper[4817]: I0218 15:40:05.171457 4817 scope.go:117] "RemoveContainer" containerID="6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca"
Feb 18 15:40:05 crc kubenswrapper[4817]: E0218 15:40:05.172195 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:40:17 crc kubenswrapper[4817]: I0218 15:40:17.172162 4817 scope.go:117] "RemoveContainer" containerID="6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca"
Feb 18 15:40:17 crc kubenswrapper[4817]: E0218 15:40:17.172789 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"
Feb 18 15:40:30 crc kubenswrapper[4817]: I0218 15:40:30.171695 4817 scope.go:117] "RemoveContainer" containerID="6c03c63cb01cb32f04cfc50144b4814885105a57283a5e06879483a8166104ca"
Feb 18 15:40:30 crc kubenswrapper[4817]: E0218 15:40:30.173623 4817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-g6zzb_openshift-machine-config-operator(b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5)\"" pod="openshift-machine-config-operator/machine-config-daemon-g6zzb" podUID="b5c599bf-d7b3-4cd0-9fe8-e31ac79192a5"